Algorithmic governance in environmental information (or how technophilia shapes environmental democracy)

These are the slides from my talk at the Algorithmic Governance workshop (for which there are lengthy notes in the previous post). The workshop explored the many ethical, legal and conceptual issues with the transition to Big Data and algorithm-based decision-making.

My contribution to the discussion builds on previous thoughts about environmental information and its public use. Inherently, I see the relationships between environmental decision-making, information, and information systems as something that needs to be examined through the prism of the long history that links them – this is how we can make sense of current trends. These three areas have been deeply linked throughout the history of the modern environmental movement since the 1960s (hence the Apollo 8 Earth image at the beginning), and the crew's Christmas message with its reference to Genesis (see below) helped in making the message stronger.

To demonstrate how this triplet evolved, I use texts from official documents – the Stockholm 1972 declaration, Rio 1992 Agenda 21, etc. They are fairly consistent in their belief in the power of information systems to solve environmental challenges. The core aspects of environmental technophilia are summarised in slide 10.

This leads to environmental democracy principles (slide 11) and the assumptions behind them (slide 12). Even when information is open, that doesn't mean it is useful or accessible to members of the public. This was true when raw air-monitoring observations were released as open data in 1997 (before anyone knew the term), and although we now have better tools (e.g. Google Earth), there are consistent challenges in making information meaningful – what do you do with an Environment Agency DSM if you don't know what it is or how to use a GIS? How do you interpret a Global Forest Watch analysis of change in tree cover in your area if you are not used to interpreting remote sensing data (a Big Data analysis and algorithmic governance example)? I therefore return to the hierarchy of technical knowledge and ability to use information (slide 20) that I covered in 'Neogeography and the delusion of democratisation', and look at how the opportunities and barriers have changed over the years in slide 21.

The last slides show that despite all the technical advancement, we can still have situations such as the water contamination in Flint, Michigan, which demonstrates that some of the problems from the 1960s that were supposed to be solved – well monitored, with clear regulations and processes – came back because of negligence and a lack of appropriate governance. This is not going to be solved with information systems, although citizen science has a role to play in dealing with the governmental failure. This whole sorry mess, and the re-emergence of air quality as a Western-world environmental problem, is a topic for another discussion…

Algorithmic Governance Workshop (NUI Galway)

Algorithmic Governance Workshop (source: Niall O Brolchain)

The workshop 'Algorithmic Governance' was organised as an intensive one-day discussion and research-needs development exercise. As the organisers, Dr John Danaher and Dr Rónán Kennedy, explained:

‘The past decade has seen an explosion in big data analytics and the use  of algorithm-based systems to assist, supplement, or replace human decision-making. This is true in private industry and in public governance. It includes, for example, the use of algorithms in healthcare policy and treatment, in identifying potential tax cheats, and in stopping terrorist plotters. Such systems are attractive in light of the increasing complexity and interconnectedness of society; the general ubiquity and efficiency of ‘smart’ technology, sometimes known as the ‘Internet of Things’; and the cutbacks to government services post-2008.
This trend towards algorithmic governance poses a number of unique challenges to effective and legitimate public-bureaucratic decision-making. Although many are already concerned about the threat to privacy, there is more at stake in the rise of algorithmic governance than this right alone. Algorithms are step-by-step computer coded instructions for taking some input (e.g. tax return/financial data), processing it, and converting it into an output (e.g. recommendation for audit). When algorithms are used to supplement or replace public decision-making, political values and policies have to be translated into computer code. The coders and designers are given a set of instructions (a project ‘spec’) to guide them in this process, but such project specs are often vague and underspecified. Programmers exercise considerable autonomy when translating these requirements into code. The difficulty is that most programmers are unaware of the values and biases that can feed into this process and fail to consider how those values and biases can manifest themselves in practice, invisibly undermining fundamental rights. This is compounded by the fact that ethics and law are not part of the training of most programmers. Indeed, many view the technology as a value-neutral tool. They consequently ignore the ethical ‘gap’ between policy and code. This workshop will bring together an interdisciplinary group of scholars and experts to address the ethical gap between policy and code.
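
To make that 'ethical gap' concrete, here is a minimal, hypothetical sketch (my own, not from the workshop) of how a vague spec – 'flag risky tax returns for audit' – forces the programmer to make value-laden choices about features, weights and thresholds:

```python
# Hypothetical illustration: a vague policy ("flag risky tax returns for audit")
# translated into code. Every constant below is a value judgement made by the
# programmer, not by the policy maker.

def audit_score(tax_return: dict) -> float:
    """Return a risk score for a single tax return (illustrative only)."""
    declared_income = tax_return.get("declared_income", 0.0)
    third_party_income = tax_return.get("third_party_income", 0.0)
    deductions = tax_return.get("deductions", 0.0)

    # Choice 1: how much weight to give a mismatch with third-party data.
    mismatch = max(third_party_income - declared_income, 0.0)
    # Choice 2: treating high deductions as inherently suspicious.
    deduction_ratio = deductions / declared_income if declared_income else 1.0

    return 0.7 * (mismatch / 10_000) + 0.3 * deduction_ratio

def flag_for_audit(tax_return: dict, threshold: float = 0.5) -> bool:
    # Choice 3: the threshold decides how many people are investigated,
    # and who bears the cost of false positives.
    return audit_score(tax_return) >= threshold

if __name__ == "__main__":
    example = {"declared_income": 30_000, "third_party_income": 42_000, "deductions": 5_000}
    print(flag_for_audit(example))  # True with these arbitrary weights
```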

The workshop was structured around three sessions of short presentations of about 12 minutes each, with immediate discussion, followed by a workshop to develop research ideas emerging from the sessions. This very long post contains my notes from the meeting. These are my takes, not necessarily those of the presenters. For another summary of the day, check John Danaher's blog post.

Session 1: Perspective on Algorithmic Governance

Professor Willie Golden (NUI Galway), 'Algorithmic governance: Old or New Problem?', focused on an information science perspective. We need to consider the history – an R.O. Mason paper from 1971 already questioned the balance between the decision-making that should be done by humans and the part that needs to be done by the system. The issue is the level of assumptions that are being integrated into the information system. Today the amount of data that is being collected, and the assumptions about what it does in the world, are growing, but we need to remain sceptical about the value of the actionable information. Algorithms need managers too: Davenport, in HBR 2013, pointed out that the questions asked by decision makers before and after the processing are critical to the effective use of data analysis systems. In addition, people are very concerned about data – we're complicit in handing over a lot of data as consumers, and the Internet of Things (IoT) will reveal much more. Deborah Estrin's 2014 CACM viewpoint – 'small data, where n = me' – highlighted the importance of health information: the monitoring of personal information can provide a baseline on you. However, this information can be handed over to health insurance companies, and the question is what control you have over it. Another aspect is Artificial Intelligence – Turing in the 1950s proposed the famous 'Turing test' for AI. In the past 3-4 years AI has become much more visible. The difference is that AI learns, which raises the question of how you can monitor a thing that learns, changes over time and gets better. AI doesn't have self-awareness, as Davenport (2015) noted in 'Just How Smart are Smart Machines?', while arguing that machines can be more accurate than humans in analysing images. We may need to be more proactive than we used to be.

Dr Kalpana Shankar (UCD), 'Algorithmic Governance – and the Death of Governance?', focused on digital curation/data sustainability and its implications for governance. We invest in data curation as a socio-technical practice, but we need to explore what it does and how effective current practices are. What are the implications if we don't do the 'data labour' needed to maintain it, to avoid 'data tumbleweeds'? We are selecting data sets and preserving them for the short and long term. There is an assumption that 'data is there' and that it doesn't need special attention. The choices that people make about which data sets to preserve will influence what appears later and the directions of research. Downstream, there are all sorts of business arrangements for making data available, and the decisions about preserving data shape the disciplines and discourses around it – for example, preserving census data influenced many of the social sciences and directed them towards certain types of questions. Data archives influenced the social science disciplines – e.g. towards using large data sets and dismissing ethnographic and qualitative data. The governance of data institutions needs examining, as does how it influences the information that is stored and shared. What the role of curating data is once data becomes open is another question. An example of the complexity is provided by a study of a system used by an NGO for 'matchmaking' refugees to mentors: the system dates from 2006 and its job classification was last updated in 2011, but the organisation that uses it cannot afford an update, and there are impacts on those who are affected by the system.

Professor John Morison (QUB), 'Algorithmic Governmentality'. From a law perspective, there is an issue of techno-optimism. He is interested in e-participation and participation in government. There are issues with open and big data, where we are given a vision of open and accountable government and a growth in democratisation – e.g. the social media revolution, or opening up government through data. We see a fantasy of abundance, and there are also new feedback loops – technological solutionism that addresses problems in politics with technical fixes: simplistic solutions to complex issues. For example, in research into cybersecurity there are expectations of creating code as a scholarly output. Big Data has different creators (from Google to national security bodies) and they don't have the same goals. There are also issues of technological authoritarianism as a tool of control. Algorithmic governance requires engaging with epistemology, ontology and governance. We need to consider the impact on democracy – the AI approach argues for democratisation through the 'N=all' argument. Leaving aside the ability to ingest all the data, it seems to assume that subjects are no longer viewed as individuals but as aggregates that can be manipulated and acted upon. In algorithmic governance there is a false emancipation through the promise of inclusiveness; instead, it responds to predictions that are created from data analysis. The analysis is presented as a scientific way to respond to social needs. Ideas of individual agency disappear. Here we can use Foucault's analysis of power to understand agency. Finally, we also see government without politics – making subjects and objects amenable to action. There is no selfhood, just a group prediction. This transcends and obviates many aspects of citizenship.

Niall O'Brolchain (Insight Centre), 'The Open Government'. There is a difference between government and governance. The eGov unit in the Galway Insight Centre of Data Analytics acts as an Open Data Institute node and is part of the Open Government Partnership (OGP). The OGP involves 66 countries and aims to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. It started in 2011 and now involves 1,500 people, with ministerial-level involvement. The OGP has a set of principles, with eligibility criteria that involve civil society and government on equal terms – the aim is to provide information so as to increase civic participation, require the highest standards of professional integrity throughout administration, and increase access to new technologies for openness and accountability. The general view is that the benefits of technology outweigh the disadvantages for citizenship. The grand challenges are improving public services, increasing public integrity, public resources, safer communities, and corporate accountability. Not surprisingly, corporate accountability is one of the weakest.

Discussion:

Using the Foucault framework, the question is about the potential for resistance that is created because of the increase in power. There are cases to discuss about hacktivism and the use of technologies. There is an issue with the ability to resist power – e.g. the passing of details between companies based on prediction. The issue is not about who uses the data and how they control it. Sometimes there is a need to use approaches that are used by illegal actors to hide their tracks in order to resist it.
A challenge for the workshop is that the area is so wide that we need to focus on specific aspects – e.g. the use of systems in government while technology is changing, or interoperability. There are overlaps between environmental democracy and open data, with many similar actors – and with much more buy-in from government and officials. There has also been technological change that makes it easier for governments (e.g. Mexico releasing environmental data under the OGP).
Sovereignty is also an issue – with a loss of it to technology and corporations over recent years; indeed, corporate accountability is noted in the OGP framework as an area that needs more attention.
There is also an issue about information that is not allowed to exist; absences and silences are important. There are issues of consent – network effects prevent real options for consent, and therefore society and academics can force businesses to behave socially in a specific way. The keeping of information and attributing it to individuals is the crux of the matter and where governance should come in. You have to communicate over the internet about who you are, but that doesn't mean that we can't dictate to corporations what they are allowed to do and how to use it. We can also consider privacy by design.

Session 2: Algorithmic Governance and the State

Dr Brendan Flynn (NUI Galway), 'When Big Data Meets Artificial Intelligence will Governance by Algorithm be More or Less Likely to Go to War?'. By looking at autonomous weapons we can learn about algorithmic governance in general. Algorithmic decision-support systems have a role to play within a very narrow scope – to do what stock markets do: identify very dangerous responses quickly and stop them. In terms of politics, many things will continue as before. One thing that comes from military systems is that there is always a 'human in the loop' – and that is sometimes the problem. There will be HCI issues with making decisions quickly based on algorithms, and things can go very wrong. There are false-positive cases, such as the USS Vincennes, which used a decision-support system in the decision to shoot down a passenger plane. Decision taking is limited by decision shaping, which is handed more and more to algorithms. There are issues with the way military practice understands command responsibility in the Navy, which sets a very high standard of responsibility for failure. There is a need to see how to interpret information from black boxes on false positives and false negatives. We can use this extreme example to learn about civic cases, and the need for high standards for officials. If we do apply some version of command responsibility to those who are using algorithms in governance, it becomes possible to put responsibility on the user of the algorithm and not only on the creators of the code.

Dr Maria Murphy (Maynooth), 'Algorithmic Surveillance: True Negatives'. We all know that algorithmic interrogation of data for crime prevention is becoming commonplace, in government and in companies, and we know that decisions can be about life and death. When considering surveillance, there are many issues. Consider the probability of assuming someone to be a potential terrorist or extremist. In human rights law we can use the concept of private life, and algorithmic processing can challenge it. Article 8 of the Human Rights Convention is not absolute and can be limited in specific cases – and the ECHR asks governments for justifications, to show that they follow the guidelines. Surveillance regulations need to explicitly identify the types of people and crimes that are open to observation; you can't say that everyone is open to surveillance. Specific keywords can be judged, but what about AI and machine learning, where the creator can't know what will come out? There is also a need to show proportionality to prevent social harm. On false positives in algorithms – because terrorism is so rare, there is a real risk of a bad impact on the prevention of terrorism or crime. With the assumption that more data is better data, we are left with a problem of generalised surveillance, which is seen as highly problematic. Interestingly, the ECHR does see a lot of potential in these technologies and their use.
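
A worked base-rate example (my own numbers, not from the talk) shows why rarity makes false positives so damaging: even a very accurate screening algorithm applied to a whole population will flag mostly innocent people.

```python
# Base-rate illustration (hypothetical numbers):
# a screening algorithm with 99% sensitivity and 99% specificity,
# applied to a population where 1 person in 100,000 is a genuine suspect.

population = 60_000_000
prevalence = 1 / 100_000          # genuine suspects per person
sensitivity = 0.99                # true positive rate
specificity = 0.99                # true negative rate

suspects = population * prevalence
innocents = population - suspects

true_positives = suspects * sensitivity
false_positives = innocents * (1 - specificity)

precision = true_positives / (true_positives + false_positives)
print(f"Flagged people: {true_positives + false_positives:,.0f}")
print(f"Share of flagged people who are genuine suspects: {precision:.2%}")
# Roughly 600,000 people are flagged, and fewer than 0.1% of them
# are genuine suspects - generalised surveillance at scale.
```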

Professor Dag Weise Schartum (University of Oslo), 'Transformation of Law into Algorithm'. His focus was on how algorithms are created, thinking about this within government systems. These systems are the bedrock of our welfare systems – which is the way they appear in law. Algorithms are a form of decision-making: general decisions about what should be considered, and then the making of individual decisions. The raw material is the legal decision-making process, which is translated into computer code and transformed into algorithms. Programmers do have autonomy when translating requirements into code – the Norwegian experience shows close work with domain experts to implement the code. You can think of an ideal transformation model from a system of rules to algorithms, one that exists within a domain – a service or authority of a government – and is done for the purpose of supporting decision-making. The process moves from the qualification of legal sources, through interpretations done in natural language, to a specification of rules, and then to a formal language which is used for programming and modelling. There are iterations throughout the process, and the system is tested, goes through a process of confirming the specification, and then goes into use. It's too complex to test every aspect of it, but once the specifications are confirmed, it is used for decision-making. In terms of research, we need to understand the transformation process in different agencies – the overall organisation, the model of system development, competences, and the degree of law-making effects. The challenge is the need to reform the system: adapting to political and social change over time. The system needs to be designed to be flexible, allowing openness rather than rigidity.
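
As a minimal sketch of this transformation process (my own illustration, using an invented benefit rule rather than any actual Norwegian regulation), a natural-language provision has to be fixed into explicit, testable choices before it can be programmed:

```python
# Hypothetical formalisation of an invented welfare rule - not a real statute.
# Legal source (natural language): "A child benefit of 1,000 per month is
# payable for each dependent child under 18; the benefit is reduced by 50%
# where household income exceeds 500,000 per year."

from dataclasses import dataclass

BENEFIT_PER_CHILD = 1_000      # interpretation fixed as a constant
AGE_LIMIT = 18                 # "under 18" -> strictly less than 18
INCOME_THRESHOLD = 500_000     # "exceeds" -> strictly greater than
REDUCTION_FACTOR = 0.5

@dataclass
class Household:
    children_ages: list
    annual_income: float

def monthly_child_benefit(h: Household) -> float:
    """Formal rule derived from the natural-language provision."""
    eligible_children = sum(1 for age in h.children_ages if age < AGE_LIMIT)
    benefit = eligible_children * BENEFIT_PER_CHILD
    if h.annual_income > INCOME_THRESHOLD:
        benefit *= REDUCTION_FACTOR
    return benefit

# A small test confirming the specification, as in the testing stage described above.
assert monthly_child_benefit(Household([4, 17, 19], 520_000)) == 1_000.0
```

Even in this toy case, interpretive choices (is 'under 18' strictly less than, and when exactly is income assessed?) are frozen in the code, which is why the transformation process itself needs to be documented and testable.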

Heike Felzman (NUI Galway), 'The Imputation of Mental Health from Social Media Contributions', came from a philosophy and psychology background. Algorithms can access different sources – blogs, social media – and this personal data is being used for mood analysis, which can lead to observations about mental health. By 2013 there were examples of identifying affective disorders, and the research doesn't consider the ethical implications. The data used includes content, individual metadata such as the time of online activities, the length of contributions, and typing speed, as well as network characteristics and biosensing such as voice and facial expressions. Some ethical challenges include: first, contextual integrity (Nissenbaum 2004/2009) – privacy expectations are context-specific, not constant rules. Second, a lack of vulnerability protection – analysis of mental health breaches people's rights to protect their health. Third, potential negative consequences, with impacts on employment, insurance, etc. Finally, the irrelevance of consent – some studies included consent in the development stage, but what about applying it in the world? We see no informed consent, no opt-out, no content-related vulnerability protections, no duty of care or risk mitigation, no feedback, and the number of participants is unlimited. All these are in contrast to practices under Human Subjects Research guidelines.
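
To make the kinds of signals listed above more tangible, here is a deliberately simplified, hypothetical sketch of the feature-extraction step – the point is not that this detects anything reliably, but that such features can be computed from ordinary posts without the person's knowledge:

```python
# Hypothetical, simplified feature extraction from social media posts -
# illustrating the kind of metadata-based signals described above,
# not a validated mental health screening method.
from datetime import datetime

NEGATIVE_WORDS = {"tired", "alone", "hopeless", "worthless"}  # toy lexicon

def extract_features(posts):
    """posts: list of (timestamp, text) tuples."""
    night_posts = sum(1 for ts, _ in posts if ts.hour < 6)         # time of activity
    avg_length = sum(len(text) for _, text in posts) / len(posts)  # contribution length
    negative_ratio = sum(
        1 for _, text in posts
        if any(word in text.lower() for word in NEGATIVE_WORDS)
    ) / len(posts)                                                  # crude content signal
    return {
        "night_post_share": night_posts / len(posts),
        "avg_post_length": avg_length,
        "negative_word_share": negative_ratio,
    }

posts = [
    (datetime(2015, 10, 5, 2, 30), "can't sleep, feeling alone again"),
    (datetime(2015, 10, 5, 14, 0), "lunch with colleagues"),
]
print(extract_features(posts))
```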

Discussion:

In terms of surveillance, we should think about self-surveillance, in which citizens provide the details of surveillance themselves. Surveillance is not only negative – modern approaches are not used only for negative reasons. There is, however, a hoarding mentality in the military-industrial complex.
The area of command responsibility received attention, with discussion of liability and different ways in which courts are treating military versus civilian responsibility.

Panel 3: Algorithmic Governance in Practice

Professor Burkhard Schafer (Edinburgh), 'Exhibit A – Algorithms as Evidence in Legal Fact Finding'. The discussion about legal aspects can easily go back to 1066 – you can go through a whole history, and there are many links from medieval law to today. As a regulatory tool, there is the issue of the rules of proof. Legal scholars don't focus enough on the importance of evidence and how to understand it. Regulation of technology is not just about the law but about implementation on the ground, for example in the case of data protection legislation. In a recent NESTA meeting, there was a discussion about the implications of Big Data – using personal data is not the only issue. For example, a citizen science project that shows low exposure to emissions might lead to a decision that the location in which the citizens monitored their area is the perfect location for a polluting activity – so harming the people who collected the data. Strictly speaking, this is not a data protection case. How can a citizen object to the 'computer says no' syndrome? What are the minimum criteria to challenge such a decision? What are the procedural rules of fairness? Having a meaningful cross-examination is difficult in such cases. Courts are sometimes happy to accept and use computer models, and at other times reluctant to take them. There are issues about the burden of proof from systems (e.g. having to show that an ATM was working correctly when a fraud took place). DNA tests rely on computer modelling, but on systems that are proprietary and closed. Many algorithms are hidden for business confidentiality, and these issues are being explored. One approach is to rely on open source tools. Replication is another way of ensuring the results. Escrow ownership of the model by a third party is another option. Finally, there is the possibility of questioning software in natural language.

Dr Aisling de Paor (DCU), 'Algorithmic Governance and Genetic Information'. There is an issue in law, with massive applications of genetic information. There is rapid technological advancement in many settings – genetic testing, pharma and many other areas – giving indications of behavioural traits, disability, and more, and there are competing rights and interests. The advances are rapid – use in health care, and the technology is becoming cheaper (already below $1,000). In commercial settings genetic information is used in insurance, and it is valuable for economic reasons and for efficiency in medical settings; there is also a focus on personalised medicine. A lot of the concerns are about the misuse of algorithms – for example, predictive assumptions about impacts on behaviour and health. The current state of predictability is limited, especially regarding the environmental impacts on the expression of genes. There are conflicting rights – efficiency and economic benefits set against human rights, e.g. the right to privacy. Also the right to non-discrimination – making decisions on the basis of probability may be deemed discriminatory. There are wider societal and public policy concerns – the possible creation of a genetic underclass and the potential to exacerbate societal stigma about disability, disease and difference. There is a need to identify gaps between law, policy and code, and to consider decided uses, commercial interests and the potential abuses.

Anthony Behan (IBM, but in a personal capacity), 'Ad Tech, Big Data and Prediction Markets: The Value of Probability'. Advertising is a very useful use case for considering what happens in such governance processes. What happens in the 200 milliseconds it takes to serve an advert, which is the standard on the internet? The process of real-time bidding is becoming standardised. It starts from a click – the publisher invokes an API and passes information about the interaction and the user, based on their cookie and various IDs. A Supply Side Platform (SSP) opens an auction. On the demand side, there are advertisers that want to push content to people – by age group, demographics, day, time, and objectives such as click-through rates. The Demand Side Platforms (DSPs) look at the SSPs, and each SSP is connected to hundreds of DSPs; complex relationships exist between these systems. DSPs estimate the probability that the person will engage in the way they want, and offer how much that is worth to them – all in micropayments. The data management platform (DMP) is important for improving the bidding – e.g. information about users, platform and context at specific times and places helps to guess how people tend to behave. The advertising economy of the internet is based on this structure. We get abstractions of intent – the more privacy was invaded to understand personality and intent, the less interest there was in a specific person and the more in probabilities and aggregates. People are viewed as current identity and current intent, and it's all about mathematics – there is a huge volume of transactions, and the inventory becomes more valuable. The interactions become more diverse with the Internet of Things. The internet becomes a 'data farm' – we moved from the concept that people are valuable to the view that data is valuable and the question of how we can extract it from people. Advertising then extends into the whole of commerce.
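
A heavily simplified, hypothetical sketch of this real-time bidding flow (the real OpenRTB protocol and DSP bidding logic are far more elaborate): the bid request goes to several demand-side bidders, each prices its bid from an estimated probability of engagement, and the highest bid wins within the latency budget.

```python
# Toy real-time bidding auction - illustrative only, not the OpenRTB protocol.
import random

def bid_request(user_cookie_id: str, context: dict) -> dict:
    """What a publisher/SSP might send when an impression becomes available."""
    return {"user_id": user_cookie_id, "context": context}

def dsp_bid(request: dict, value_per_click: float) -> float:
    """A demand-side platform prices its bid from a predicted engagement probability."""
    # In reality this comes from a model fed by a data management platform (DMP);
    # here it is a random stand-in.
    p_click = random.uniform(0.001, 0.02)
    return p_click * value_per_click   # expected value of showing the ad

def run_auction(request: dict, dsps: dict) -> tuple:
    """SSP picks the highest bid among the responding DSPs."""
    bids = {name: dsp_bid(request, value) for name, value in dsps.items()}
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

request = bid_request("cookie-123", {"site": "news", "hour": 14})
winner, price = run_auction(request, {"dsp_a": 2.0, "dsp_b": 1.5, "dsp_c": 3.0})
print(f"{winner} wins at a price of {price:.4f} (micropayment per impression)")
```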

I’ll blog about my talk ‘Algorithmic Governance in Environmental Information (or How Technophilia Shapes Environmental Democracy) later.

Discussion:

There are issues with genetics and eugenics: eugenics fell out of favour because of problems with the science, and the new genetics claims much more predictive power. In neuroscience there are issues with brain scans, which are being used on the basis of insufficient scientific evidence. There is an issue with discrimination – we shouldn't assume that it is only negative; we need to think about unjustified discrimination, and there are different semantics to the word. There are also issues with institutional information infrastructure.

Environmental information: between scarcity/abundance and emotions/rationality

The Eye on Earth Summit, which was held in Abu Dhabi last week, allowed me to immerse myself in the topics that I've been researching for a long time: geographic information, public access to environmental information, participation, citizen science, and the role of all these in policy making. My notes (day 1 morning, day 1 afternoon, day 2 morning, day 2 afternoon, day 3 morning & day 3 afternoon) provide the background for this post, as do the blog posts from Elisabeth Tyson (day 1, day 2) and the IISD reports and bulletins from the summit. The summit provided me with plenty to think about, so I thought it worth reflecting on my 'take home' messages.

What follows are my personal reflections from the summit and the themes that I feel are emerging in the area of environmental information today. 

When considering the recent ratification of the Sustainable Development Goals (SDGs) by the UN Assembly, it is not surprising that they loomed large over the summit – as drivers of demand for environmental information over the next 15 years, as focal points for the effort to coordinate information collection and dissemination, but also as an opportunity to make new links between environment and health, or to promote environmental democracy (access to information, participation in decision making, and access to justice). It seems that the SDGs are very much at the front of the minds of the international organisations that are part of the Eye on Earth alliance, although other organisations, companies and researchers who come with a more technical focus (e.g. Big Data or remote sensing) are less aware of them – at least in terms of referring to them in their presentations during the summit.

Beyond the SDGs, two overarching tensions emerged throughout the presentations and discussions – and both are challenging. They are the tensions between abundance and scarcity, and between emotions and rationality. Let’s look at them in turn.

Abundance and scarcity came up again and again. On the data side, the themes of the 'data revolution', more satellite information, crowdsourcing from many thousands of weather observers, and the creation of more sources of information (e.g. the Environmental Democracy Index) are all examples of abundance in the amount of available data and information. At the same time, this was contrasted with scarcity in the real world (e.g. species extinction, the health of mangroves), scarcity of actionable knowledge, and a scarcity of ecologists with computing skills. Some speakers oscillated between these two ends within a few slides or even in the same one. There wasn't an easy resolution to this tension, and both ends were presented as challenges.


With emotions and scientific rationality, the story was different. Here the conference was packed with examples showing that we're (finally!) moving away from a simplistic 'information deficit model' that emphasises scientific rationality as the main way to lead a change in policy or in public understanding of environmental change. Throughout the summit, presenters emphasised the role of mass media communication, art (including a live painting developed through the summit by the GRID-Arendal team), music, visualisation, and storytelling as vital ingredients that make information and knowledge relevant and actionable. Instead of a 'Two Cultures' position, Eye on Earth offered a much more harmonious and collaborative linkage between these two ways of thinking and feeling.

Next, and linked to the issue of abundance and scarcity, are costs and funding. Many talks demonstrated the value of open data and the need to provide open, free and accessible information if we want to see environmental information used effectively. Moreover, providing the information with the ability to analyse or visualise it over the web was offered as a way to make it more powerful. However, the systems are costly, and although the IUCN's assessment demonstrated that the investment in environmental datasets is modest compared to other sources (and the same is true for citizen science), there are as yet no sustainable, consistent and appropriate funding mechanisms. Funding infrastructure or networking activities is also challenging, as funders accept their value but are not willing to fund them in a sustainable way. More generally, there is an issue about the need to fund ecological and environmental studies – it seems that while 'established science' is busy with 'Big Science' – satellites, Big Data, complex computer modelling – the work of studying ecosystems in a holistic way is left to a small group of dedicated researchers and to volunteers. The urgency and speed of environmental change demand better funding for these areas and activities.

This leads us to the issue of citizen science, for which the good news is that it was mentioned throughout the summit, gaining more prominence than four years ago in the first summit (where it also received attention). In every plenary session, citizen science or crowdsourced geographic information was mentioned at least once, and frequently by several speakers. Examples include the Hermes project for recording ocean temperatures, Airscapes Singapore for urban air quality monitoring, Weather Underground for sharing weather information, the Humanitarian OpenStreetMap Team's work in Malawi, Kathmandu Living Lab's response to the earthquake in Nepal, the Arab Youth Climate Movement in Bahrain's use of iNaturalist to record ecological observations, Jacky Judas's work with volunteers to monitor dragonflies in Wadi Wurayah National Park – and many more. The summit outcomes document is also clear: "The Summit highlighted the role of citizen science groups in supporting governments to fill data gaps, particularly across the environmental and social dimensions of sustainable development. Citizen Science was a major focus area within the Summit agenda and there was general consensus that reporting against SDGs must include citizen science data. To this end, a global coalition of citizen science groups will be established by the relevant actors and the Eye on Earth Alliance will continue to engage citizen science groups so that new data can be generated in areas where gaps are evident. The importance of citizen engagement in decision-making processes was also highlighted."

However, there was ambivalence about it – should it be seen as an instrument, a tool to produce environmental information, or as a means to achieve wider awareness and engagement by informed citizens? How best to achieve the multiple goals of citizen science: raising awareness, educating, providing skills well beyond the specific topic of the project, and democratising decision making and participation? It still seems to be the case that integrating citizen science into day-to-day operations is challenging for many of the international organisations involved in the Eye on Earth alliance.

Another area of challenging interactions emerged from the need for wide partnerships between governments, international organisations, Non-Governmental Organisations (NGOs), companies, start-ups, and even the ad-hoc crowds that respond to a specific event or issue, which are afforded by digital and social networks. There are very different speeds of implementation and delivery between these bodies, and in some cases there are chasms that need to be explored – for example, an undercurrent from some technology start-ups is that governments are irrelevant, and in some forms of thinking that 'to move fast and break things' – including existing social contracts and practices – is OK. It was somewhat surprising to hear speakers praising Uber or AirBnB, especially when they came from people who are familiar with the need for careful negotiations that take into account wider goals and objectives. I can see the wish to move things faster – but what risks do we bring by breaking things?

With the discussions about Rio Principle 10 and the new developments in Latin America, the Environmental Democracy Index, and the rest, I became more convinced, as I noted in 2011, that we need to start thinking about adding another right to the three that are included in it (access to environmental information, participation in decision-making, and access to justice), and to develop a right to produce environmental information that will be taken seriously by the authorities – in other words, a right to citizen science. I was somewhat surprised by the responses when I raised this point during the discussion on Principle 10.

Final panel (source: IISD)

Finally, Eye on Earth was inclusive and collaborative, and it was a pleasure to see how open people were to discussing issues and exploring new connections, points of view or new ways of thinking. A special point that drew several positive responses was the gender representation at such a high-level international conference with a fairly technical focus (see the image of the closing panel). The composition of the speakers at the summit, and the fact that it was possible to have such a level of women's representation, was fantastic to experience (making one of the male-only panels on the last day look odd!). It is also an important lesson for many academic conferences – if Eye on Earth can do it, I cannot see a reason why it is not possible elsewhere.

Eye on Earth (Day 3 – Afternoon) Remote sensing, conservation monitoring and closing remarks

The afternoon of the last day of Eye on Earth included two plenary sessions, and a discussion (for the morning, see this post). The first plenary focused on Remote sensing and location enabling applications:

Taner Kodanaz (DigitalGlobe): technology that once looked out to the sky now allows us to look at the Earth from 400 miles up. DigitalGlobe started 14 years ago with high-resolution satellite imagery – with billions of users a day relying on online maps. In natural disasters, they provide information that helps in responding to them. Examples of accelerating efforts include forest fires and intentional fires – in Global Forest Watch, DigitalGlobe data is used to monitor fire and deforestation and to address them, and WRI-led work in Indonesia deals with forest fires. He also showed Missing Maps and the response to the Kathmandu earthquake, among other cases.

Anil Kumar (Environment Agency – Abu Dhabi): Abu Dhabi has been making conservation efforts for a long time. They have a special interest in Houbara, falcons, the Scimitar-horned Oryx and several other species. Abu Dhabi was doing wildlife tracking 20 years ago, using satellite tracking to gain insights into migratory routes and stopovers and to reach agreements about avoiding hunting during migration, and they have studied different patterns of use. They have also done habitat mapping using satellite information, with field verification to check that the classification works. The local ability to create classifications of different habitats made it possible to share them, digitally and on paper. This allows protecting areas, following national and international obligations, and improving governance, and even supports emergency response and accurate blue carbon information. They also map local forestation. They have an environmental portal and share the information.

Lian Pin Koh (Conservation Drones): the idea was to be able to monitor Orang-utan nests, which are difficult to monitor from the ground. Because commercial drones are expensive, he was involved in creating a DIY drone in 2012, based on a toy plane with a programmable route and a simple camera. This attracted attention from conservation groups and community scientists. Conservation Drones started as a project and has worked in many places – they have managed to use drones for a wide range of projects and have shared their experience. The drone is cheap – $700 – and allows repeat monitoring and the identification of illegal logging, reaching 1-2 cm resolution. It was also used in disaster relief in a case of flooding from a burst dam that happened during forest monitoring. Attitudes to ConservationDrones.org changed rapidly, from ridicule to excitement, and they are now involved in exploring how to map and quantify biomass – fuel load and controlled burns. The issue with drones is to create actionable information.

Justin Saunders (eMapsite): Malawi experienced incredible rainfall, with 200,000 people displaced. Rapid response doesn't happen until an event reaches the news – and this one didn't receive much attention. They used the UN Charter to gain access to radar imagery that helped in responding to the places that were flooded. They could see the inundation, and also used a flood model to see how realistic it was. Climate change exceeded all the assumptions – including the one-in-500-years event. In Malawi there isn't information about buildings and community assets, so they worked with OpenStreetMap, carrying out community mapping following the practices of Open Cities, and this allowed the support of many relief organisations. They also used Masdap.mw, the Malawi Spatial Data Portal (based on open source), which allows information sharing – having only one platform helps to ensure sharing. They use crowdsourcing before, during and after the event – they are aware that with climate change, events will exceed historical records. The use of open source software encourages people to train, and improved the flood modelling. Institutions take up new technology, data and methodology rapidly – especially when it is free and does not require investment. Visualisation helped action.

Steven Ramage (What3Words): there are 135 countries that don't have addressing information, so for the Universal Postal Union this is very valuable. There are four billion people without a location reference. what3words allows the creation of a digital location reference of three words in places that are informal and don't have an addressing system. There are 860,000 people in informal settlements – how do we communicate their locations? Instead of lat/long, when you need to communicate between people, it creates a three-word key to the place. The system is small – 10MB – and can work without connectivity, and there is research demonstrating that words are easier to remember than numbers. Longer words are assigned to less populated areas, and there is a new dictionary for each language, enabling integration with indigenous languages. It has started to be used by esri, nestoria, the UN, Safe Software, Mapillary and GoCarShare. It was used in the Nepal earthquake and in the delivery of medicine to informal settlements; UNOCHA suggests using what3words.
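
To illustrate the general idea (this is not what3words' actual algorithm or word list – just a hypothetical sketch), one can divide the world into fixed-size grid cells, number them, and map each cell number to a unique combination of three words from a dictionary:

```python
# Hypothetical sketch of a grid-cell-to-three-words scheme.
# This is NOT the what3words algorithm or word list - just an illustration
# of how ~40,000 words can address billions of grid cells (40,000^3 combinations).

WORDS = [f"word{i}" for i in range(40_000)]   # stand-in dictionary
CELL_SIZE = 3 / 3600                          # ~3 arc-second cells, for illustration

def cell_index(lat: float, lon: float) -> int:
    """Number the grid cells row by row across the globe."""
    row = int((lat + 90) / CELL_SIZE)
    col = int((lon + 180) / CELL_SIZE)
    cols_per_row = int(360 / CELL_SIZE)
    return row * cols_per_row + col

def to_three_words(lat: float, lon: float) -> str:
    """Encode a cell number in base len(WORDS) to get a three-word key."""
    n = cell_index(lat, lon)
    w1, rest = divmod(n, len(WORDS) ** 2)
    w2, w3 = divmod(rest, len(WORDS))
    return ".".join([WORDS[w1], WORDS[w2], WORDS[w3]])

print(to_three_words(24.4539, 54.3773))  # a three-word key for a point in Abu Dhabi
```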

The final set of talks, titled 'Feet in the Field', was chaired by Stuart Parerson (Conservation Leadership Programme) and explored volunteering programmes. He noted that the questions for the session were: How do we build capacity to collect primary data? How do we make people future conservation leaders? How do we communicate with policy makers? Feet in the Field is aimed at supporting future conservation leaders. They have a six-stage process for identifying and promoting young leaders. There is a need for investment and attention to maintain diversity.

David Kuria (KENVO): Kijabe Environment Volunteers – exploring conservation and livelihoods – was founded in 1994 in the Kikuyu Escarpment Forest. They do education but also community empowerment. They observed forest degradation – illegal logging, overgrazing and also a breakdown of social systems. Knowledge and skills were gained locally and through NGOs, and are used to mobilise the community and to lobby, but also for patrolling and monitoring. They have done different studies – poaching, bird surveys, forest monitoring, as well as climate change and carbon trading. The data is used for action – e.g. encouraging ecotourism, or capacity building for many farmers. Data is important for decision makers and a strong tool for conservation awareness – and it fosters support. But more important is the human side – good leadership, motivation and engagement, respecting existing systems, ownership by stakeholders, working with marginalised groups. There are many challenges: technical capacity, resources, high turnover of government staff, limited ability in volunteering, a vast area, and more.

Alberto Campos (Aquasis, Brazil) – 21 years preventing extinction in Brazil. Based in Fortaleza, they look after highly endangered marine mammals and birds. They have an emergency plan and an action plan – and to deliver these they need a long-term plan. The problem is that they need long-term funding and conservation and fieldwork training – and they have been receiving support from the CLP. Systems that they developed have been adopted by the government. Communicating these results means shifting the focus from the conservation of species to the resources those species help to conserve. Biodiversity conservation opens up other resources – the Manakin is becoming an indicator of clean and accessible water – and that helps in gaining recognition for them.

Ayesha Yousef Al Blooshi (Marine Biodiversity at EAD): as primary producers of environmental data, EAD produce data, then pass it to the environmental management sector; it is used by government and then shared with the world. They have been monitoring corals undersea, taking photo transects that are then analysed – a very manual process that takes a lot of time. They are thinking about using CoralNet, which uses machine learning to recognise species. The seagrass supports the population of Dugongs, which they monitor from the air; they also track them, and use drone technology to monitor dolphins. They have a collector app that allows them to record different sightings, which speeds up and simplifies data collection. They also gather traditional knowledge from fishermen – looking at the past and capturing a wealth of data.

Nicolas Heard spoke about funding from the Mohamed bin Zayed conservation fund. They like people who are passionate about their species, who can show how a small grant can be used to further the cause of that species. The passion needs to be matched with science – it is also important to pass on enthusiasm to local communities, but that is not enough: there is a need for data, information, knowledge, skills and collaboration. They provide small grants for surveys and monitoring and encourage the contribution of data for other purposes. They help support outreach, prioritise conservation action, and help with efficiency.

Jacky Judas (Wadi Wurayah National Park), on the eastern coast of the UAE. The park was created in 2009 and made a RAMSAR site in 2010, with the aim of developing a management plan. The water research programme covers education, awareness and scientific data: participants learn about freshwater ecosystems and their challenges, and also learn how to monitor the ecosystem. There are 10-15 volunteers through EarthWatch; research activities include toad monitoring – field data collection, lab experiments, data input – and also monitoring dragonflies (the area is a hotspot for them), which led to the discovery of a species that had never been spotted in the UAE. Working with volunteers allows monitoring across the seasons; they use iNaturalist and contribute to GBIF.

Jean-Christophe Vié (IUCN): IUCN has a tradition of primary data collection. Behind each assessment of the 70,000 species on the Red List there is at least one person working on the ground. The habitat conservation programme they created allows them to support primary data collection. Species are a good way to tell stories. Projects such as Save Our Species help in understanding the distribution of species and then in identifying key areas where support for conservation can be provided. They ask for some monitoring information in order to understand the impact of the investment.

Summary of the session: we need research capacity; data must lead to action; show how species help to protect other resources; combine traditional and scientific knowledge; and realise that small amounts of funding can go a long way with volunteers.

Once that part was completed, we moved to the summary of the summit. 

H.E. Razan Khalifa Al Mubarak, Jacqueline McGlade, Barbara Ryan, Janet Ranganathan and Thomas Brooks.
Nima Abu-Wardeh, who moderated the whole summit, set questions to the panel. How do you all fit together? Razan: we find ways to fit together – the regions are represented, there are many positive things happening in the Arab region and we share them. Barbara: no one organisation can deal with environmental problems alone; the power comes from bringing together the public, private and civil society sectors – all need to work together, and there are challenges in changing our internal systems to bridge the transition from data to wisdom; we need to do that. Thomas: IUCN fits into Eye on Earth through the power of its network of public bodies, 1,000 civil society members, and more than 10,000 experts. Janet: WRI is trying to scale things through counting and presenting the results in an engaging way. Jacquie: what is important is representing the UN family and making the poor and vulnerable heard. To address environmental problems, we need the Eye on Earth alliance; this is the way to reach out across the world. What are the tools and mechanisms that people need – how will the 'how am I going to do it?' happen? Jacquie: UNEP Live provides web intelligence information, and we can see how clusters of knowledge are being built up. Things are linked to other places across the world, letting citizens influence the agenda. Razan: we need to synchronise the elevators – one with the policy makers who need the data and another with the scientists who are producing it. We need that synchronisation to change in each region according to need.
People can completely bypass the system in many ways, but what happens if policy makers take too much time and the needs are urgent – what will happen after the event? Jacquie: we have suggested activities that deal with foundational areas – a global network of networks, environmental education, access for all – and then link to thematic areas: biodiversity, disaster management, community sustainability and resilience, oceans and blue carbon, and water security. Barbara: for the organisations we are involved in, we need to think about how our existing activities fit with the identified themes. Thomas: IUCN can contribute its knowledge products to the range of Eye on Earth products, and advocate for mechanisms to develop the capacity to generate data. Janet: WRI contributes data platforms – Resource Watch, Forest Watch – and works on access for all; the Environmental Democracy Index came out of Eye on Earth.
How do we do things better? There is much ground to cover in stimulating change. Barbara: for partnerships to work, they have to align with our own vision, and this partnership lets us do that. Advocacy for broad open data policies – we need to get on with it. Jacquie: we need to bring Principle 10 to the UN. We need to open up governmental debate, grab participation by the neck and make it central to what we do. We have a big environmental assembly. We need data that informs. Barbara: the capabilities of citizen science and citizen sensing were front and centre, and that is central.

We need to talk with the media and about behaviour change, broadening our horizons.

Razan: we converge and collaborate. We came from all regions of the world and all walks of life – some affiliated with government, research, start-ups and companies, ecologists and environmentalists. Many here were also here in 2011. She gave thanks for the signalling of the value of the Eye on Earth network, which is developing a strong sense of community and aiming to solve major problems of the planet. We see a sense of purpose in assisting the monitoring of, and progress towards, the SDGs. Five organisations have committed to being founding members of the alliance: AGEDI, IUCN, WRI, UNEP and GEO. They commit to developing, assisting and guiding the global community to achieve the SDGs. Eye on Earth can provide a collective voice – it is an informal alliance – and there is agreement to convene Eye on Earth again.

Eye on Earth (Day 3 – Morning) – Enabling Conditions and access to information, participation & justice

Building on the themes of Data Demand (on the first day of the summit) and Data Supply (on the second day), the last day of the Eye on Earth Summit explored the enabling conditions that link producers and users of data.
Before the first plenary, the World Resources Institute (WRI) launched the Environmental Democracy Index (EDI). Lalanth de Silva noted that the index ranks countries according to the Principle 10 pillars: access to environmental information, participation in decision making, and access to justice. The index has been sent to governments since its beta release in May, and the responses led to adjustments of scores. 70 countries are included and 30 responded, including comments from civil society. The index was supported by 140 lawyers from across the world.
Jesse Worker (WRI) provided the background: The Access Initiative started in 1999 – a network of over 200 civil society organisations in over 50 countries that support the Principle 10 pillars. The focus of environmental democracy is information, participation and justice. There has been progress since 1992 and there are other regulations, such as Environmental Impact Assessment (EIA). At the same time, laws are weak or absent in many countries; practice also lags behind, and there is no consistent measurement of the progress of laws. The EDI is based on 75 legal indicators, following the Bali guidelines, with 24 supplemental practice (implementation) indicators. They started with 70 countries, with 140 lawyers advising – each country had two, one assessing and one reviewing. On the website, each country gets a short, accessible introduction, and the country's response is also included on the page. It provides civil society with information about government intentions. There is also the ability to rank countries. The indicators are based on an established framework (the Bali guidelines), with limited subjectivity in how they are evaluated, making the index easily accessible and engaging governments and stakeholders. It also helps civil society to learn about what others have achieved. The top countries are Lithuania, Latvia, the US and South Africa. It is notable that the signatories to the Aarhus Convention, which is a binding convention, are doing better. Countries with good laws tend to have better practices, and access to information was ahead of public participation. 19 countries responded and score changes were made. They aim to update the index every two years and to reach global coverage by 2019. They also aim for Aarhus Convention-specific indicators and to expand the assessment of implementation.
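
As a rough sketch of how such an index can be assembled (my own simplification, not the actual EDI methodology or weighting), each indicator is scored, scores are averaged within a pillar, and countries are ranked on the combined result:

```python
# Simplified illustration of indicator-based index scoring - not the actual
# EDI methodology, just the general pattern of aggregating indicator scores
# into pillar scores and ranking countries. Scores here are invented.

def pillar_score(indicator_scores: list) -> float:
    """Average the scores of the indicators belonging to one pillar."""
    return sum(indicator_scores) / len(indicator_scores)

def country_score(pillars: dict) -> float:
    """Average the three pillar scores (information, participation, justice)."""
    return sum(pillar_score(scores) for scores in pillars.values()) / len(pillars)

countries = {
    "Country A": {"information": [3, 2, 3], "participation": [1, 2, 1], "justice": [2, 2, 3]},
    "Country B": {"information": [2, 2, 2], "participation": [2, 3, 2], "justice": [1, 2, 2]},
}

ranking = sorted(countries, key=lambda c: country_score(countries[c]), reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(rank, name, round(country_score(countries[name]), 2))
```
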
For Jordan, Seif Hijazi commented that the EDI results were below expectations – they expected that the score would be higher, based on their perception of the legal system in their country; Jordan ranked 63rd out of 70. Some examples: a recently enacted law allows requesting and accessing information from the government, but there are limitations – e.g. the applicant needs to demonstrate a direct interest, which is difficult in law. In public participation, the EIA regulations require public participation – but there is no legal requirement to consider the comments from the public. Government officials agreed with the scores – and they want to take corrective measures to improve the situation. Jordan is one of the few countries in the region to have an access to information law.
For Jamaica (Danielle Andrade), the score was especially low on participation, particularly in environmental impact assessment, policy and law making. The EDI provided a new impetus for working on legislation for public participation – the government dusted off drafts from 2011 and worked on implementation. The EDI assessments are used for legal reforms. There is a process of extending Principle 10 in South America and the Caribbean, and the EDI informs the country's position in such negotiations. The score on access to justice was midway, with a lack of support for groups and individuals to fund their representation in court.
Generally, participation is the pillar that lags behind – even in democracies there aren't enough public spaces to engage with government. There were comments from Italy, Jordan and Lebanon about the importance of participation and the need for an active civil society to promote it. Jesse: because of limited resources they worked with TAI network members, and most European countries are members of Aarhus, which is developing indicators specific to its own system. On participation laws and practice – people need timely information to be informed citizens. People have constraints on their time, and they need timely information in the public domain and to know that their comments will be taken into account – you need to know that your comments will be taken seriously. There are gaps between proactive information disclosure and what is done in practice, for example the requirement to provide information on facilities that have a big impact on the environment. Assessing public participation is very difficult. There are also laws that limit the scope of civil society, so it is an ongoing issue that requires monitoring.

The first plenary of the day developed the theme of the day, 'Creating the enabling environment' – getting attention, remembering and acting is important. It opened with Jim Toomey – a cartoonist committed to the ocean, who has worked with UNEP on the communication of ocean-related issues. There is a revolution in the media industry in terms of sharing and accessing content, and it is undergoing transitions similar to those affecting data. With his comics, he mixes entertainment with a message (e.g. a cartoon about sustainable seafood). Media is very powerful – e.g. compare celebrities on the web with information on climate change (see the pie chart in the slide!). The ability to create content, and the fact that much of it is without commercial interest, allows new forms of producing and sharing information. Issues such as climate change or ocean acidification are critical, but the media is not covering them – so become your own media campaign. He looked at issues with UNEP, including Blue Carbon, climate change, sea level rise and more.

The plenary, which included a keynote and short statements, was chaired by Inger Andersen (IUCN), with Enrico Giovannini (economist and statistician, University of Rome), Carmelle Terborgh (Esri) and Patricia Zurita (BirdLife International).

Inger – we stand at a crossroads, and we need to make choices with a sense of understanding of the choices that we make. We are on an unsustainable path: increased inequalities, stresses on the environment, biodiversity loss etc. We see extinction of species at 1,000 times the natural rate – we need a dramatic change in policy direction and action. We are making choices – the SDGs are not just a list of goals, they are about choosing a different path. Next, the Paris COP21 will need to demonstrate that we can get on the path for 2 degrees and act towards it. We need good data to make good decisions – we have drops of information from seas of data. The enabling conditions are not there to link data into environmental information that is relevant. The conditions that are needed: financing – the IUCN Red List and other knowledge products are needed for much decision-making, and the datasets are very cost-effective, with an amazing body of volunteers contributing 300 volunteer years. We don’t see the investment that should come with this. Compare this to data that goes into other systems – the global observatory on climate is funded in billions. People are happy to get environmental information for free, but this is not matched with investment. Open data is interesting, but it also raises issues for professional scientists of credit, plagiarism etc. There are also the technologies and the use of data from remote sensing (e.g. WRI Global Forest Watch). In some ways, conservation is lagging behind the attention given to climate change. How can we also improve ocean monitoring – we need it to be able to make decisions. Better tools also matter to enable implementation and environmental impact assessment. Let’s make tools actionable. Capacity building is key, with different funds – we need ‘feet on the ground’ to make conservation possible. We also need the resources to make knowledge available so that people get direct benefits from these activities. The conservation movement is a greying movement – how are we going to fire up the new generation, with love of nature? How do we inspire children to be part of this army for good?

Enrico – we want to generate information and science to anticipate what people will experience. Enabling conditions are about the overall environment needed to reach the SDGs. A thought experiment: what would a brand new country that wants to reach the goals do? It would have to put the SDGs in its constitution, and should have an assessment of every piece of legislation to check that it fits the SDGs. Enabling conditions go beyond financial, technical or statistical conditions. A UN report on the data revolution for sustainable development influenced the statistical monitoring of the SDGs. We are not moving at an appropriate speed – with the way the UN system reacts, we won’t have a baseline until 2019. We need these baselines faster. There is a waste of money in international organisations – e.g. in visualisation systems or data repositories and a lack of data sharing. We need a new social contract with the private sector and companies to get the information that is needed for sustainable development data. Speed is required, and we need to avoid waste and share resources.

Carmelle – we need an integrative framework; GIS is a way of bringing issues together, making it possible to integrate issues in a way that leads to action. GIS is essential to many decision-making processes, and we need to think about networked GIS as a way to allow geographic understanding across organisations. There is a need to build people’s capacity to access information, but also to make it possible to access and use open data. We should use maps to tell stories – to illustrate key issues. Empowering people through apps and devices is a way to make information useful in context. GIS and geospatial technologies are needed as part of an enabling condition.

Patricia – BirdLife International is 150 organisations working on nature, with birds as ambassadors. The issue is how they make an impact on the ground. They created IBAs – with huge volunteer effort and multi-million dollar investment. They try to turn information into stories, such as the Marine IBA e-atlas, to help protect and conserve areas across the globe. Taking action is about empowering local people through technology – not just as information gatherers, but being able to interpret data and use it for local decision-making. There is a need for adequate resources – not only to collect data but also to monitor and continue to invest in it over time. How do we ensure that we have the funding to upgrade technology as much as the private sector does? How can that be done without capitalising on intellectual property? We can have hybrid access models to ensure income. We need a local-to-global approach. We need to continue to maintain the science team so there is a robust understanding of what was collected. We need to turn sources like the Global Environment Outlook into digestible pills.

Following the panel, the session Principle 10 of the Rio Declaration – for better environmental governance and access for all in different regions explored: “Efficiency and accountability of policy development can be further enhanced through a more open access to environmental information and data as well as better conditions for public participation in environmental decision-making thus aiming for environmental governance improvement.
Major progress was achieved in this regards on the regional level since Rio Conference and especially after Rio+20 Conference on Principle 10 (access to information and  public participation on environmental matters) promotion and implementation. Most recent development marks the Latin American and Caribbean regions Principle 10 process where 20 countries launched few months ago the negotiation of Principle 10 regional instrument.”

Alexander Juras (leading the Access for All special initiative) chaired. He took us on a journey through Principle 10 in different places, starting with a short video on Principle 10 that is used to promote the Latin America process.


Principle 10 holds governments to account, and some governments don’t like it – in many regions of the world, it’s an ongoing struggle to make it happen.


Carlos de Miguel (UN ECLAC) – the process in the Latin America and Caribbean region – governments should do their job to enable people to participate. It connects human rights, the environment and access rights. At Rio+20, 20 countries in the region decided to develop an initiative around Principle 10, with links to different goals in the SDGs – goals 16 and 17 are explicitly connected to Principle 10. There is progress, but also challenges: lack of regulations, and many people without access to information for economic, social and political reasons. Sometimes the information itself is lacking – there is a need for alternative ways of resolving conflicts. The need for a regional agreement is to maintain compliance, to allow collaboration and to increase commitments. The new agreement can potentially impact 500 million people. The process evolved from 2012 to 2014, with final negotiations starting now. There are many resolutions, also in intergovernmental forums – a lot of political backing. The structure of the document includes the 3 pillars and other aspects – with reference to the Bali Guidelines and other developments since Principle 10. There is also wide public consultation on the document. The aim is to reach agreement by 2016.

Danielle Andrade (lawyer for Jamaica / TAI) and Andrea Sanhueza (founder of TAI) – Danielle opened, discussing the impact of access rights on people’s lives. The Caribbean is not only a holiday spot: for example, she told the story of a state-owned sewage plant that had been malfunctioning since the 1970s, but continued to receive effluent and created local problems. Only with an NGO did they manage to bring a court action about the neglect of the site, and use freedom of information to demonstrate that people were being charged to pay for fixing the plant. That led to fixing the sewage plant. Andrea talked about examples from Ecuador – in 2004, in Tumbaco, some people had headaches and skin conditions. They did tests and found they were suffering from arsenic poisoning. The water system was managed by the municipality, and they set up a public group for water without arsenic, using attention in the media and an investigation of the case by the government. The analysis included a range of tests, showing the blood contamination, carried out by an external lab in Canada. The municipal company argued that they couldn’t deal with the pipes, but they changed the source of the water and that helped in solving the situation.

Tsvetelina Filipova (REC for Central and Eastern Europe) – Building Bridges between regions http://building-bridges.rec.org/ – Aarhus changed the behaviour of governments and of people, who understood that they have a right – and that is because it is legally binding. The process was not ideal, and lots of countries had difficulties – many countries were ready, but even the countries that thought they were good on Principle 10 legislation failed many times. The project is about inter-regional cooperation and helped in sharing the experience from Aarhus with Latin America. Some benefits: supporting the negotiation process and having experience of how to deal with issues that come up. There is also interregional experience on how to implement, and it also empowers stakeholders. In all these initiatives it is people who are pushing the process forward. The process requires funding so that it is inclusive enough. The implementation of Building Bridges was through training and live online exchange seminars – sharing good practice and drawing recommendations on running the process efficiently. The benefits span designing, drafting, negotiating, implementing and interpreting. Some of the people have been involved in working on these issues since 1996.

Alexander Juras – the Aarhus Convention also helped in instilling democratic values in many countries that used to be part of the Soviet Union.

Jeremy Wates (European Environmental Bureau, past secretary of Aarhus) – the development of Principle 10 in the Middle East and North Africa region. It is not enough to have an environmental information system if you don’t provide the legal rights – don’t treat it as a marginal aspect of the Eye on Earth framework, it needs to be a central process. The second point is that the Aarhus Convention is not talked about enough and is taken for granted – not that it has all gone right, and there are real challenges in fitting within it – even today the EU is struggling to comply. Aarhus applies to many countries with longer and shorter experiences of democracy. Building Bridges is about a forum for dialogue – lots of mistakes that can be learnt from. The next region to open this dialogue in is the MENA region, but the political situation in the region led to selecting a few countries to start with: Morocco, Tunisia, Jordan and the UAE, with the aim of implementing the Bali Guidelines better. Some of them have been aware of the Aarhus process. The hope is to get enhanced environmental performance and participatory government. They see 4 main actions in the process: raising awareness; carrying out gap analysis to see what is already in place and what information is already available; encouraging government and civil society involvement in the Aarhus Convention process; and strengthening civil society organisations and networks. That is aimed at a 2-year project. The issue is to get partnerships going.

Stephen Stec (Central European University, author of the Bali Guidelines and the Aarhus implementation guide) covered the Bali Guidelines – an effective tool for implementing Rio Principle 10 at the national level. The standards for the rest of the world are the Bali Guidelines from 2010. It is a global instrument for Principle 10, based on national experience and the international experience from Aarhus. The guidelines are voluntary and request-driven, to help fill gaps in national legislation. There are 26 guidelines – most on access to justice, and the early ones on access to information. The Access for All special initiative of EoE includes several outcomes – the Environmental Democracy Index; the UNITAR national profiles that are part of the Environmental Governance Programme – national assessment and tailored capacity building; UNEP-run regional workshops to promote multi-stakeholder dialogue on Rio Principle 10 and the guidelines; and the implementation guide to the Bali Guidelines that was launched on the first day.

Discussion: should we move beyond Principle 10 and start to think about how we support public production of environmental information? This is a growing area, and information has completely changed. Should we see citizen science as something that will take care of itself – say, the chemical release inventory – or is it worth putting the effort into the current extension of Principle 10 into more areas? For countries with less developed IT, further promotion of rights and improving active citizenship can be done through citizen science. On public production of information – if you want to provide data in non-traditional ways, the issue is recognition and allowing it to be used in decision-making processes. Danielle gave an example of community data collection in a mining case to fill a data gap. The current Principle 10 process requires certain standards – and it might find its way into the agreement.

A question from Cameroon about the legal framework for access to justice in terms of cost and expertise, and when will the bridge reach Africa? The experience is that you need a government that is committed to the idea of a regional convention, and which also shows leadership in transparency, stability, openness etc. That worked in the LAC area. The limitation on building bridges is the cost of extending projects.

From Mauritius – for small island states in the ‘Rest of the World’ region, working with UNEP is very complicated, and as a new network, how can they work together when they have limited capacity?

Eye on Earth (Day 2 – Afternoon) – Cost of knowledge, citizen science & visualisation

The first afternoon session was dedicated to Understanding the Costs of Knowledge – Cost of Data Generation and Maintenance (my second day morning post is here)

The session was moderated by Thomas Brooks (IUCN) – over the last couple of days we heard about innovation in the mobilisation of environmental and socio-economic data. All these innovations have a price tag, and some are quite large. There is a need to budget for them and pay for them accordingly. Establishing the costs of knowledge products in biodiversity is important. First, four products are explored, and then the costs are analysed.

Richard Jenkins – the IUCN Red List of Threatened Species. He explained the list and the essential procedures and components that create it. The Red List is a framework for classifying threatened species into different categories – vulnerable, endangered and critically endangered species are included in the list. It’s a critical source for conservation – over 75,000 species, with over 3,000,000 people visiting the website each year to find information. The foundation of the information is a structured process with ongoing cycles of evaluation and analysis. It is based on donor support – volunteer time in data collection, as well as professional time to evaluate the information and run an online database. Costs include workshops, training and travel; for professional time there are communications, researchers, developers and fundraisers; and ICT costs: hosting, maintenance, software licensing, hardware etc. The costs can be one-off (setting up a new system), recurring (evaluations) and annual (systems and people). Partnerships and volunteerism are essential and need to be recognised. Re-assessments are needed, as well as developing tools and uptake.

Jon Paul Rodriguez – the IUCN Red List of Ecosystems, an emerging product – ecosystem collapse is a transformation beyond the typical situation. An example of this is the Aral Sea – with impacts on wildlife and human life around it. They use a risk model for ecosystems with 4 symptoms as criteria, and categories similar to the species Red List. They do global assessments at continental and national scales. Costs: the compilation of data, which is spatial information, is complex, time-consuming and challenging. There are economies of scale if you do regional/global analyses, and the first assessment is costly, but updates will be cheaper. The benefits: ecosystem mapping can be used for other knowledge products (e.g. protected areas), it provides a capacity-building model, and it is done with open access data. With the potential integration of the two red lists there is a more effective product. Commercial users will need to pay.

Ian May – BirdLife International – Key Biodiversity Areas (KBAs): a set of information about sites that are identified for biodiversity conservation using standard criteria by a range of bodies. There are Important Bird Areas and Critical Ecosystem Partnership Fund areas (particular hotspots across multiple taxa). The future direction is to standardise the KBAs. They are used in IFC Performance Standard 6, which forces development banks to take them into account, and they are integrated in the Natura 2000 Birds Directive and in the CBD Aichi Targets.

Naomi Kingston – WCMC – protected areas (the Protected Planet product) – it’s a project about deliver, connect, analyse and change – the World Database on Protected Areas. It has been in development since 1959, evolving from a list of national parks and equivalent reserves. There are 700 data providers globally, not only governments but also NGOs and community groups. A database that has evolved over time needs to be treated carefully, considering what each polygon and point means. 91.3% of the records are polygon data, and the database has grown from 41,305 sites in 1998 to 200,000 today. They raise its profile through different activities. There is a website – www.protectedplanet.net. Data is supposed to be updated every 5 years, and it is used in the SDGs, academic research and the strategic plan for biodiversity. They want to see decisions that are based on it – e.g. IBAT, which supports business. There is a direct connection between the resources that are available and the ability to provide training, outreach and capacity building.

Diego Juffe – costing the knowledge products – he assessed the financial investment in developing and maintaining biodiversity information, evaluating development costs to 2013, maintenance costs and future costs. The datasets that were covered are used in decision-making, academic research and more. They developed a methodology to evaluate primary data collection costs, network supporting costs, national red lists of species, and the costs of producing scientific papers. They looked at different aspects – personnel, infrastructure, workshops & travel, and publication and outreach – covering all the funding (from donors, the private sector, government, NGOs etc.), including volunteer time, and converted it to 2014 USD. They looked at data from the 1980s to 2013. To date, the investment is between US$116 and US$204 million in development and maintenance, and 67,000 to 73,000 volunteer days – almost 200 years. The annual investment is about US$6.5 million and 12.5 volunteer days/year. Most was funded from philanthropy (53%) and government (27%). There is a very large investment in personnel. They expect that future investment to 2020 will be in the range of US$100 million. That will give us a comprehensive baseline. Without data we can’t make decisions, and this is very small compared to running a census or other systems. Some of the open questions: what’s the impact of this investment? Are there better ways to make the products even more cost-effective? What is the real cost of volunteer time? How do we avoid duplication of effort?
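
To make the costing approach described in the talk a bit more concrete, here is a minimal sketch (in Python, with entirely hypothetical figures, rates and category names – not the study’s actual data) of how costs across personnel, infrastructure, workshops & travel, and publication & outreach might be aggregated, with volunteer time valued at a notional day rate and everything expressed in 2014 USD.

```python
# Minimal sketch of aggregating knowledge-product costs (hypothetical figures).
# Categories follow the ones mentioned in the talk; rates and totals are invented.

from collections import defaultdict

# Hypothetical yearly cost records: (year, category, amount in 2014 USD)
cost_records = [
    (2012, "personnel", 1_200_000),
    (2012, "infrastructure", 300_000),
    (2012, "workshops_travel", 150_000),
    (2012, "publication_outreach", 50_000),
    (2013, "personnel", 1_250_000),
    (2013, "infrastructure", 280_000),
]

# Hypothetical volunteer effort, valued at a notional day rate (an assumption).
volunteer_days = {2012: 2_000, 2013: 2_100}
VOLUNTEER_DAY_RATE_USD = 250

totals = defaultdict(float)
for year, category, amount in cost_records:
    totals[category] += amount

volunteer_value = sum(days * VOLUNTEER_DAY_RATE_USD for days in volunteer_days.values())
grand_total = sum(totals.values()) + volunteer_value

print("Cost by category:", dict(totals))
print(f"Volunteer time valued at: ${volunteer_value:,.0f}")
print(f"Total investment (2014 USD): ${grand_total:,.0f}")
```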

A second afternoon session focused on Everyone is a supplier: crowd-sourcing, citizen science and indigenous knowledge. Craig Hanson (WRI) opened with a comment that there is a lot of data from remote sensing and professional scientists – but what is the role of citizens? There are 7 billion mobile phones worldwide, and with near-global Internet connectivity, citizens anywhere are now capable of being the eyes and ears of the planet. The session looked at successful approaches for engaging people to crowd-source data and contribute to citizen science, and at how indigenous knowledge can be systematically integrated into decision-making, with applications from around the world. WRI is also involved in this process, and in Global Forest Watch – which started with partners processing data, but satellites can’t see everything, so JGI and WRI use ODK to provide ground truth on forest clearing.
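
As a rough illustration of the ground-truthing idea (not the actual JGI/WRI workflow or data schema), the sketch below matches hypothetical field observations – of the kind collected with a mobile form tool such as ODK – against satellite clearing alerts by simple distance, to flag which alerts have been confirmed on the ground. Coordinates, field names and the matching radius are all assumptions.

```python
# Rough sketch: matching field observations to satellite clearing alerts.
# Coordinates, thresholds and field names are hypothetical.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

satellite_alerts = [  # hypothetical alert locations from remote sensing
    {"id": "A1", "lat": -2.150, "lon": 29.900},
    {"id": "A2", "lat": -2.310, "lon": 30.050},
]
field_reports = [  # hypothetical observations submitted through a mobile form
    {"observer": "ranger_01", "lat": -2.152, "lon": 29.903, "cleared": True},
    {"observer": "ranger_02", "lat": -2.500, "lon": 30.400, "cleared": False},
]

MATCH_RADIUS_KM = 1.0  # assumption: a report within 1 km confirms an alert

for alert in satellite_alerts:
    confirmed = any(
        r["cleared"] and haversine_km(alert["lat"], alert["lon"], r["lat"], r["lon"]) <= MATCH_RADIUS_KM
        for r in field_reports
    )
    print(alert["id"], "confirmed on the ground" if confirmed else "not yet verified")
```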

Jacquie McGlade covered UNEP Live – citizen science was mentioned many times in the summit, but now we need to make voices heard. We need alternative models of how the world operates. All UNEP assessments will include alternative views of Mother Earth – a challenge for the Western science point of view. UNEP Live was designed to give citizens access to data that was collected by governments, but now it also includes citizen science – there is now legislation that includes rights for people to gather data and to make sure that these data are used in decision-making. It’s all about co-production of knowledge – from the structured world of metadata and schemas to the unstructured data of social media and NGOs. The idea of co-production of knowledge requires management of knowledge with ontologies – noticing 23 different definitions of ‘legal’, and many definitions of ‘access’ or ‘forest’, is a challenge. The SDG Interface Catalogue is providing the ontology. There are examples from climate change in the Arctic and from species monitoring in ecosystem capital accounts that involve forest communities. Motivating people is important – air quality is a great opportunity for citizen science, with local interest in the information. People in Kibera were willing to pay for access to air quality equipment as they see it as important for their children.
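
The harmonisation problem described here – many competing definitions of ‘forest’, ‘legal’ or ‘access’ across datasets – is, at its simplest, a mapping of local terms onto a shared vocabulary. A minimal sketch of that idea (with made-up concepts and synonyms; the real SDG Interface Catalogue ontology is far richer) might look like this:

```python
# Minimal sketch: mapping local dataset terms to a shared ontology concept.
# The vocabulary and synonyms below are invented for illustration.

shared_ontology = {
    "forest": {"forest", "woodland", "tree cover > 10%", "forested land"},
    "protected_area": {"protected area", "national park", "reserve"},
}

def map_to_concept(local_term):
    """Return the shared concept a local term maps to, or None if unmapped."""
    term = local_term.strip().lower()
    for concept, synonyms in shared_ontology.items():
        if term in synonyms:
            return concept
    return None  # unmapped terms need human review

for term in ["Woodland", "National Park", "mangrove"]:
    print(term, "->", map_to_concept(term))
```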

Brian Sullivan (Google Earth Outreach) – everyone is a supplier. Indigenous groups are using the tools for telling stories and for environmental monitoring, and the protected area of the Surui has been included in a partnership with Google. They’ve done cultural mapping with the Surui and worked with other communities, who decide if they want to make it public or private. Environmental monitoring was another activity – using ODK. They build resource use and other information that helps to protect the land. They are working with other groups in Brazil. Another project is Global Fishing Watch – visualising fishing fleets. Using machine learning, they have been monitoring fishing, and it also allows you to zoom in to a specific ship – monitoring areas where there are limited resources and enforcement cannot be done by sending a ship.
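
Global Fishing Watch’s actual models are far more sophisticated, but a toy version of the underlying idea – inferring likely fishing activity from vessel track behaviour – can be sketched with a simple speed heuristic over hypothetical AIS-style points. The track, the speed range and the rule are all invented for illustration only.

```python
# Toy sketch: flagging likely fishing activity from vessel speed.
# Real systems use machine learning over full AIS tracks; this heuristic,
# the track and the thresholds are assumptions for illustration.

track = [  # hypothetical AIS-style points: (hour, speed in knots)
    (0, 11.8), (1, 12.1), (2, 3.2), (3, 2.7), (4, 2.9), (5, 10.5),
]

# Assumption: trawling-like behaviour often shows sustained low-but-nonzero speeds.
FISHING_SPEED_RANGE = (1.0, 5.0)

def likely_fishing(speed_knots):
    low, high = FISHING_SPEED_RANGE
    return low <= speed_knots <= high

segments = [(hour, speed, likely_fishing(speed)) for hour, speed in track]
fishing_hours = sum(1 for _, _, fishing in segments if fishing)

for hour, speed, fishing in segments:
    print(f"hour {hour}: {speed:4.1f} kn -> {'likely fishing' if fishing else 'transit'}")
print(f"Estimated fishing effort: {fishing_hours} of {len(track)} hours")
```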

Tunitiak Katan, looking at his tribal territory in Ecuador – the national context, indigenous people, climate change and measurement. Ecuador has many indigenous groups – 11 different cultures. He was involved in carbon estimation and ecosystem assessment, working with different groups using traditional ecological knowledge (ancestors’ knowledge). They explored the issues of climate discussions with different groups from 9 cultures, with 312 people discussing REDD/REDD+. They carried out measurements in the Amazon demonstrating carbon capture. Now they are carrying out a project in the Kutukú-Shaim region for conservation, restoration and management, selected because the area has a lot of rivers that feed the Amazon river. They aim to achieve holistic management. “We and the forest are one”.

Nick Wright from @crowdicity – a belief that in each organisation or community there are transformative ideas that are not seeing the light of day. We are more connected than ever before. Technology changes the way people link and interact, and this is becoming the norm. Connectivity makes technologies part of the solution, and the vast majority of the world will benefit from this connectivity. It’s about not just collecting the information but also connecting the dots and making sense of it. Increased connectivity is challenging hierarchy. How can citizens participate in decision-making and have the opportunity to participate? Crowdsourcing is a way to strengthen the relationship between government and the people. Crowdicity worked with Rio to explore the Olympic legacy. They created Agora Rio to allow people to discuss issues and make the city better. They started online and moved to the real world – pop-up town hall meetings – coordinating community groups and reaching out from the online process to those who didn’t have access. They had a process that made it possible to work online and offline. This led to 24 proposals for projects, of which 4 are going forward, done in 12-week cycles. The important thing is to create a social movement for the period of time – a sense of energy. Crowdsourcing can work in the UN system – for the post-2015 development agenda, it helped to amplify the conversation to 16 million people around the world, taking views from across the globe – BYND 2015 is the first ever crowdsourced UN declaration.

Andrew Hill of @cartodb covered the importance of citizen science in Planet Hunters, but mostly wanted to talk about maps. How do you engage people who can contribute code or technical skills? GitHub is a system that is central to how technology works. A successful project can have many participants – it’s a community of 10 million users. How can we find coders for a project? A lot of the time there is a lack of contributions beyond the lead. We need to engage people to create technologies for communities. Hackathons can be problematic without thinking beyond the specific event. There is a need to consider small grants, and also to think about people who sit somewhere between coding and use. Maps might be the data visualisation type that changes people’s behaviour most often. Maybe, as a tool to make things easy – it should be a map? Websites like timby.org can allow people to tell their story. CartoDB also makes it possible for people to take data and show it in different ways.

Discussion: getting to the idea is possible, but then the challenge is to keep people engaged. Suggestion: give information back so people see the value in the information. There is a need for a feedback loop for people to see what they have learned and to build expertise. A personal journey of learning is important.

The final plenary was Reaching audiences through innovations in visualisation – for people to act on information, they need to understand it. Visualisation can increase that understanding. Bringing together leading experts and practitioners, the plenary showcased innovations in data visualisation and applications that advance sustainable development.

 

Janet Ranganathan showed the WRI Resource Watch. There is a gap between data provision and data use – there are a lot of open data portals and you get lost. There is a need to help people listen to the signals of the planet and act on them. The opportunity is the wealth of data that is coming out. Building on Global Forest Watch, they focus on the nexus of water, food, energy and forests. It provides access to data, but also analysis, and then sharing of the insights.

Craig Mills talked about the visual experience – it’s not a data revolution, it’s about presenting information. There is a need to create a fusion between data and storytelling. He provided a walkthrough of Resource Watch, showing how to make information personal and the need to rethink how maps are displayed, rather than following conventions from GIS. There are ways of thinking about visualisation principles. Stop and think about sharing – see the connections before things are displayed on the map. Think about how to get your data to where people already are. Make it easy to embed in other places – make a big share button. Use emotions and feelings to create a connection. Context is the secret – expect people to use things on phones or tablets; actually, think about information as mobile first. With voice-activated interfaces and SMS as well, we can reach everyone.

Angela Oduor Lungati – Ushahidi – explored how marginalisation comes not from scarcity, but from poverty, power and inequality (UN Human Development Report 2006). She showed how privatisation of water reduces access to water. Ushahidi is a platform that allows ordinary citizens to raise their voice and share information. Information can come in by SMS, web or smartphone – whatever people have – allowing data collection, management, visualisation and alerts. The pothole theory: there is an event that triggers your action, and it needs to be local and personal. Kathmandu Living Labs used Ushahidi for proper assessment in QuakeMap.org. The tool is also used by the Louisiana Bucket Brigade. Ushahidi has been used by 18 million people in 159 countries, and it is made in Africa. She suggested the metaphor: data are the seeds, platforms are the land, and the farmers are the people. Technology is just 10% of the solution.
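
The point about meeting people on whatever channel they have – SMS, web or smartphone – boils down to normalising very different submissions into one report structure before mapping them. A minimal, entirely hypothetical sketch of that idea (not the Ushahidi codebase or its actual data model; field names and the SMS convention are invented):

```python
# Minimal sketch: normalising reports from different channels into one structure.
# Field names and parsing rules are hypothetical, not Ushahidi's data model.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    channel: str
    text: str
    lat: Optional[float] = None
    lon: Optional[float] = None

def from_sms(raw):
    # Assumed SMS convention: "<message> @ <lat>,<lon>" (location part optional)
    if "@" in raw:
        text, _, loc = raw.rpartition("@")
        lat_s, lon_s = loc.split(",")
        return Report("sms", text.strip(), float(lat_s), float(lon_s))
    return Report("sms", raw.strip())

def from_web(form):
    return Report("web", form["description"], form.get("lat"), form.get("lon"))

reports = [
    from_sms("Burst water pipe on main road @ -1.2921,36.8219"),
    from_web({"description": "Flooded street near market", "lat": -1.30, "lon": 36.82}),
]
for r in reports:
    located = "mapped" if r.lat is not None else "needs geolocation"
    print(f"[{r.channel}] {r.text} ({located})")
```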

Trista Patterson – the NewMedia Lab at GRID-Arendal – a history of many reports and viral graphics. The NewMedia Lab exists to invigorate radical experimentation and rapid prototyping – moving beyond paper-focused design. It is about connecting people with data, the audience and emotions. Dependence on technology increases, instead of envisioning what it is that we deeply need most – our capacity for envisioning – and we need to exercise this capability. They explore relationships with artists, and envisioning with children. Data + emotions = decisions and actions. Iteration and endurance in experimentation.

The last side event, Citizen scientists and their role in monitoring local to global environmental change, explored a project in Abu Dhabi that involves divers in recording data about sharks, and a project in Bahrain by the regional Arab Youth Climate Movement. The citizen science programme chose to use iNaturalist in Bahrain as a way to make people less blind to nature: small sessions open to the public in a natural World Heritage site introduce the concept of citizen science, which is not known to the public, and let them use the app to help identify species; they would like to see people engage in citizen science from a younger age. There is a challenge in Abu Dhabi with engaging divers in monitoring sharks when the Gulf is a major exporter of fins. Initiatives take time to develop, and in Abu Dhabi they have the challenge that the divers are ex-pats who stay for a few years and then leave, so they need to keep recruiting people.

Eye on Earth (Day 2 – Morning) – moving to data supply

The second day of Eye on Earth moved from data demand to supply. You can find my posts from day one, with the morning and the afternoon sessions. I have only partial notes on the plenary Data Revolution – data supply side, although I’ve posted separately the slides from my talk. The description of the session stated: The purpose of the session is to set the tone and direction for the “data supply” theme of the 2nd day of the Summit. The speakers focused on the revolution in data – the logarithmic explosion both in terms of data volume and of data sources. Most importantly, the keynote addresses will highlight the undiscovered potential of these new resources and providers to contribute to informed decision-making about environmental, social and economic challenges faced by politicians, businesses, governments, scientists and ordinary citizens.

The session was moderated by Barbara J. Ryan (GEO) – the volume of data downloaded from Landsat demonstrates the information revolution: from 53 scenes/day to 5,700 scenes/day once it became open data – demonstrating the power of open. Now there are well over 25 million downloads a year. There is a similar experience in Canada, and there are also new and innovative ways to make the data accessible and useful.

The first talk was from Philemon Mjwara (GEO): the amount of data is growing and there is an increasing demand for Earth observations, but even in the distilled form of academic publications there is an explosion, and it’s impossible to read everything about your field. Therefore we need to use different tools – search engines, article recommendation systems. This is also true for EO data – users need the ability to search, then process, and only then can they use the information. This is where GEO comes in. It’s about comprehensive, effective and useful information. GEO works with 87 participating organisations. They promote open data policies across their membership, as this facilitates the creation of a global system of systems (GEOSS). GEOSS is about supply, and through the GEO infrastructure it can be shared with many users. We need to remember that the range of sources is varied: from satellites, to aerial imagery, to under-sea rovers. GEO works across the value chain – the producers, value-added organisations and the users. An example of this working is an analysis that helps to link information about crops to information about potential vulnerability in food prices.

Mary Glackin (The Weather Company) reviewed how weather data is making people safer and businesses smarter. The Weather Company is about the expression of climate in the patterns of weather. Extreme events make people notice. Weather is about what happens in the 100 km above the Earth’s surface, but also in the oceans (3.6 km average depth), which we don’t properly observe yet and which have an impact on weather. There are 3 challenges: keeping people safe, helping businesses by forecasting, and engaging with decision makers. Measuring the atmosphere and the oceans is done by many bodies that go beyond official agencies – it now includes universities and companies, but also citizens’ observations from across the world (through Weather Underground). The participants, in return, receive a localised forecast for their area and details of nearby observations. It’s a very large citizen science project, and engagement with citizen scientists is part of their work. Forecasting requires complex computer modelling – they produce 11 billion forecasts a day. Engaging decision makers can mean an individual fisherman who needs to decide whether to go out to sea or not. There is a need for an authoritative voice that creates trust when there are critical issues, such as response to extreme events. Another example is the use of information about turbulence from airplanes, which is then used to improve modelling and provide up-to-date information to airlines to decide on routes and operations. Technology is changing – for example, smartphones now produce air pressure data and other sensing abilities that can be used for better modelling. There are policies that are required to enable data sharing, as well as partnerships between government and private sector companies. A good example is NOAA agreeing to share all their data with cloud providers (Microsoft, Amazon, Google) on the condition that the raw data will be available for anyone to download free of charge, while the providers are free to create value-added services on top of the data.
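
As a simple illustration of how crowdsourced smartphone pressure readings could feed into modelling, the sketch below bins hypothetical observations into grid cells and averages them – roughly the kind of spatial aggregation done before assimilation. The readings, grid size and workflow are assumptions; real pipelines also apply quality control, bias correction and timestamp matching.

```python
# Simple sketch: averaging crowdsourced pressure readings per grid cell.
# Readings and grid size are hypothetical; real assimilation pipelines are far richer.

from collections import defaultdict

GRID_DEG = 0.5  # assumption: half-degree grid cells

readings = [  # (lat, lon, surface pressure in hPa) from hypothetical phones
    (51.50, -0.12, 1012.3),
    (51.52, -0.10, 1011.8),
    (51.51, -0.13, 1013.1),
    (48.85, 2.35, 1008.9),
]

cells = defaultdict(list)
for lat, lon, pressure in readings:
    cell = (round(lat / GRID_DEG) * GRID_DEG, round(lon / GRID_DEG) * GRID_DEG)
    cells[cell].append(pressure)

for (clat, clon), values in sorted(cells.items()):
    mean_p = sum(values) / len(values)
    print(f"cell ({clat:.1f}, {clon:.1f}): {len(values)} obs, mean {mean_p:.1f} hPa")
```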

Next was my talk, for which a summary and slides are available in a separate post.

Chris Tucker (MapStory) suggested that it is possible to empower policy makers with open data. MapStory is an atlas of change that anyone can edit, as can be seen in the development of a city, or the way enumeration districts evolved over time. The system is about maps, although the motivation to overlay information and collect it can be genealogy – for example, being able to identify historical district names. History is a good driver for understanding the world, for example maps that show the colonisation of Africa. The information can be administrative boundaries, imagery or environmental information. He sees MapStory as a community. Why should policy makers care? They should because ‘change is the only constant’, and history helps us understand how we got here and think about directions for the future. Policy needs to rely on data that comes from multiple sources – governmental sources, NGOs, or citizens’ data. There is a need for a place to hold such information and weave stories from it. Stories are a good way to work out the decisions that we need to make, and they also allow ordinary citizens to give their interpretation of information. In a way, we are empowering people to tell stories.

The final talk was from Mae Jemison (MD and former astronaut). She grew up during a period of radical innovations, both social and scientific – civil rights, new forms of dance, visions of a promising future in Star Trek, and the Apollo missions. These led her to get to space on a Shuttle mission in 1992, during which she was busy most of the time with experiments, but from time to time looked out of the window to see the tiny sliver of atmosphere around the Earth, within which the whole of life exists. Importantly, the planet doesn’t need protection – the question is: will humans be in the future of the planet? Every generation has a mission, and ours is to see ourselves linked to the totality of Earth – life, plants and even minerals. Even if we create a way to travel through space, the vast majority of us will not get off this planet. So the question is: how do we get to the extraordinary? This leads us to look at data, and we need to be aware that while there is a lot of it, it doesn’t necessarily mean information, and information doesn’t mean wisdom. She noted that in medical studies data (from tests with patients) have the characteristics of specificity (relevance to the issue at hand) and sensitivity (can it measure what we want to measure?). We tend to value and act upon what we can measure, but we need to consider whether we are doing it right. Compelling data cause us to pay attention, and can lead to action. Data connect us across time and to understanding a universe greater than ourselves, as the pictures from the Hubble telescope that show the formation of stars do. These issues come together in her current initiative, ‘100 Year Starship’ – if we aim to have an interstellar ship built within the next 100 years, we will have to think about sustainability, life support and ecosystems in a way that will help us solve problems here on Earth. It is about how to have an inclusive journey to make a transformation on Earth. She completed her talk by linking art, music and visualisation with the work of Bella Gaia.

After the plenary, the session Data for Sustainable Development built on the themes from the plenary. Some of the talks in the session were:

Louis Liebenberg presented CyberTracker – showing how it evolved from its early stages in the mid 1990s to use across the world. The business model of CyberTracker is such that people can download it for free, but it is mostly used offline in many places, with the majority of users using it as a local tool. This raises issues of data sharing – data doesn’t go beyond the people who manage the project. CyberTracker addresses the need to extend citizen science activities to a whole range of participants beyond the affluent population that usually participates in nature observations.

Gary Lawrence – discussed how, with Big Data, we can engage the public in deciding which problems need to be resolved – not only the technical or scientific community. Ideas will emerge within Big Data that might be coincidence or causality; many cases are coincidental. The framing should be: who are we today? What are we trying to become? What has to be different two, five, ten years from now if we’re going to achieve it? Most organisations don’t even know where they are today. There is also an issue with Big Data: is it driven by a future that people want? There are good examples of using big data in a city context that take into account the needs of all groups – government, business and citizens – in Helsinki and other places.

B – the Big Data in ESPA experience (www.espa.ac.uk) – data doesn’t have value until it is used. ESPA is an international, interdisciplinary science programme on ecosystem services for poverty alleviation. Looking at the opportunities first, then the challenges. Opportunities: the SDGs are an articulation of a demand to deliver benefits to societal needs through new data-led solutions for sustainable development, with new technologies – remote sensing/UAVs, existing data sets, citizen science and mobile telephony – combined with open access to data and web-based applications. Citizen science is also about empowering communities with access to data. We need to make commitments to take data and use it to transform lives.

Discussion: lots of people are sitting on a lot of valuable data that is considered private and is not shared. A commitment to open data should help in solving the problems of making data accessible and ensuring that it is shared. We need to make projects aware that the data will be archived and have procedures in place, and there is also a need for staff and repositories. An issue is how to engage private sector actors in data sharing. In work with indigenous communities, Louis noted that the most valuable thing is that the data can be used to transfer information to future generations and explain how things are done.