My contribution to the discussion is based on previous thoughts on environmental information and the public's use of it. I see the relationships between environmental decision-making, information, and information systems as something that needs to be examined through the prism of the long history that links them – this is how we can make sense of current trends. These three areas have been deeply linked throughout the history of the modern environmental movement since the 1960s (hence the Apollo 8 Earth image at the beginning), and the Christmas message from the crew, with its reference to Genesis (see below), helped make the message stronger.
To demonstrate the way this triplet evolved, I’m using texts from official documents – Stockholm 1972 declaration, Rio 1992 Agenda 21, etc. They are fairly consistent in their belief in the power of information systems in solving environmental challenges. The core aspects of environmental technophilia are summarised in slide 10.
This leads to environmental democracy principles (slide 11) and the assumptions behind them (slide 12). Even when information is open, that doesn't mean it is useful or accessible to members of the public. This was true when raw air monitoring observations were released as open data in 1997 (before anyone knew the term), and although we have better tools (e.g. Google Earth), there are consistent challenges in making information meaningful – what do you do with an Environment Agency DSM if you don't know what it is or how to use a GIS? How do you interpret a Global Forest Watch analysis of change in tree cover in your area if you are not used to interpreting remote sensing data (a big data analysis and algorithmic governance example)? I therefore return to the hierarchy of technical knowledge and the ability to use information (in slide 20) that I covered in 'Neogeography and the delusion of democratisation', and look at how the opportunities and barriers have changed over the years in slide 21.
The last slides show that despite all the technical advancement, we can still have situations such as the water contamination in Flint, Michigan, which demonstrates that some of the problems from the 1960s that were supposed to be solved, well monitored, with clear regulations and processes, came back because of negligence and a lack of appropriate governance. This is not going to be solved with information systems, although citizen science has a role to play in dealing with the governmental failure. This whole sorry mess, and the re-emergence of air quality as a Western-world environmental problem, is a topic for another discussion…
The workshop 'Algorithmic Governance' was organised as an intensive one-day discussion and development of research needs. As the organisers, Dr John Danaher and Dr Rónán Kennedy, identified:
'The past decade has seen an explosion in big data analytics and the use of algorithm-based systems to assist, supplement, or replace human decision-making. This is true in private industry and in public governance. It includes, for example, the use of algorithms in healthcare policy and treatment, in identifying potential tax cheats, and in stopping terrorist plotters. Such systems are attractive in light of the increasing complexity and interconnectedness of society; the general ubiquity and efficiency of 'smart' technology, sometimes known as the 'Internet of Things'; and the cutbacks to government services post-2008. This trend towards algorithmic governance poses a number of unique challenges to effective and legitimate public-bureaucratic decision-making. Although many are already concerned about the threat to privacy, there is more at stake in the rise of algorithmic governance than this right alone. Algorithms are step-by-step computer coded instructions for taking some input (e.g. tax return/financial data), processing it, and converting it into an output (e.g. recommendation for audit). When algorithms are used to supplement or replace public decision-making, political values and policies have to be translated into computer code. The coders and designers are given a set of instructions (a project 'spec') to guide them in this process, but such project specs are often vague and underspecified. Programmers exercise considerable autonomy when translating these requirements into code. The difficulty is that most programmers are unaware of the values and biases that can feed into this process and fail to consider how those values and biases can manifest themselves in practice, invisibly undermining fundamental rights. This is compounded by the fact that ethics and law are not part of the training of most programmers. Indeed, many view the technology as a value-neutral tool. They consequently ignore the ethical 'gap' between policy and code. This workshop will bring together an interdisciplinary group of scholars and experts to address the ethical gap between policy and code.'
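To make the quoted point about the policy-to-code gap concrete, here is a minimal, hypothetical sketch of the tax-audit example: the 'spec' says only 'flag suspicious returns', so the features and thresholds below are the programmer's own value judgements (all invented for illustration, not taken from any real system).

```python
# Hypothetical illustration of the policy-to-code gap described in the workshop brief:
# the "spec" says only "flag suspicious returns for audit", so the thresholds and
# the choice of features below are value judgements made by the programmer.

from dataclasses import dataclass

@dataclass
class TaxReturn:
    declared_income: float
    cash_deposits: float
    deductions: float

def flag_for_audit(r: TaxReturn) -> bool:
    # Choice 1: "suspicious" operationalised as deductions above 40% of income.
    high_deductions = r.deductions > 0.4 * r.declared_income
    # Choice 2: cash deposits well above declared income treated as a red flag,
    # which may systematically target cash-based occupations.
    cash_mismatch = r.cash_deposits > 1.5 * r.declared_income
    return high_deductions or cash_mismatch

print(flag_for_audit(TaxReturn(declared_income=30_000, cash_deposits=50_000, deductions=5_000)))  # True
```

Neither threshold appears anywhere in the 'policy'; both are exactly the kind of invisible interpretation the workshop brief is worried about.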
The workshop was structured around 3 sessions of short presentations of about 12 minutes, each followed by an immediate discussion, and then a workshop to develop research ideas emerging from the sessions. This very long post contains my notes from the meeting. These are my takes, not necessarily those of the presenters. For another summary of the day, check John Danaher's blog post.
Session 1: Perspective on Algorithmic Governance
Professor Willie Golden (NUI Galway), 'Algorithmic Governance: Old or New Problem?', focused on an information science perspective. We need to consider the history – an R.O. Mason paper from 1971 already questioned the balance between the decision-making that should be done by humans and the part that needs to be done by the system. The issue is the level of assumptions that are being integrated into the information system. Today the amount of data being collected, and the assumptions about what it does in the world, keep growing, but we need to remain sceptical about the value of the actionable information. Algorithms need managers too – Davenport, in HBR 2013, pointed out that the questions decision makers ask before and after the processing are critical to effective use of data analysis systems. In addition, people are very concerned about data – we're complicit in handing over a lot of data as consumers, and the Internet of Things (IoT) will reveal much more. Deborah Estrin's 2014 CACM viewpoint, 'Small data, where n = me', highlighted the importance of health information: monitoring personal information can provide a baseline about you. However, this information can be handed over to health insurance companies, and the question is what control you have over it. Another aspect is Artificial Intelligence – Turing in the 1950s introduced the famous 'Turing test' for AI. In the past 3-4 years it has become much more visible. The difference is that AI learns, which raises the question of how you monitor a thing that learns, changes over time, and gets better. AI doesn't have self-awareness, as Davenport noted in 2015 in 'Just How Smart Are Smart Machines?', which also argues that machines can be more accurate than humans in analysing images. We may need to be more proactive than we used to be.
Dr Kalpana Shankar (UCD), 'Algorithmic Governance – and the Death of Governance?', focused on digital curation/data sustainability and the implications for governance. We invest in data curation as a socio-technical practice, but need to explore what it does and how effective current practices are. What are the implications if we don't do the 'data labour' to maintain data and avoid 'data tumbleweed'? We are selecting data sets and preserving them for the short and long term. There is an assumption that 'data is there' and that it doesn't need special attention. The choices people make about which data sets to preserve will influence the patterns of what appears later and the directions of research. Downstream, there are all sorts of business arrangements to make data available, and the decisions about preserving data shape disciplines and the discourses around them – for example, preserving census data influenced many of the social sciences and directed them towards certain types of questions. Data archives influenced the social science disciplines – e.g. towards using large data sets and dismissing ethnographic and qualitative data. We need to get into the governance of data institutions and how it influences the information that is stored and shared. What the role of curating data is once data becomes open is another question. An example of the complexity is provided in a study of a system for 'matchmaking' refugees to mentors, used by an NGO: the system dates from 2006 and its job classification was last updated in 2011, but the organisation that uses the system cannot afford to update it, and there are impacts on those who are affected by the system.
Professor John Morison (QUB), 'Algorithmic Governmentality'. From a law perspective, there is an issue of techno-optimism. He is interested in e-participation and participation in government. There are issues of open and big data, where we are given a vision of open and accountable government and growth in democratisation – e.g. the social media revolution, or opening up government through data. We see a fantasy of abundance, and there are also new feedback loops – technological solutionism that answers problems in politics with technical fixes, offering simplistic solutions to complex issues. For example, in research into cybersecurity there are expectations of creating code as a scholarly output. Big Data has different creators (from Google to national security bodies) and they don't share the same goals. There are also issues of technological authoritarianism as a tool of control. Algorithmic governance requires us to engage with epistemology, ontology and governance. We need to consider the impact on democracy – the AI approach argues for democratisation through the 'N = all' argument. Leaving aside the ability to ingest all the data, it seems to assume that subjects are no longer viewed as individuals but as aggregates that can be manipulated and acted upon. In algorithmic governance there is a false emancipation through the promise of inclusiveness; instead it responds to predictions that are created from data analysis. The analysis claims to be a scientific way to respond to social needs. Ideas of individual agency disappear. Here we can use Foucault's analysis of power to understand agency. Finally, we also see government without politics – making subjects and objects amenable to action. There is no selfhood, just a group prediction. This transcends and obviates many aspects of citizenship.
Niall O'Brolchain (Insight Centre), 'The Open Government'. There is a difference between government and governance. The eGov unit in the Galway Insight Centre of Data Analytics acts as an Open Data Institute node and is part of the Open Government Partnership. The OGP involves 66 countries, aiming to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. It started in 2011 and now involves 1500 people, with ministerial-level involvement. The OGP has a set of principles, with eligibility criteria that involve civil society and government on equal terms – the aim is to provide information so that it increases civic participation, to require the highest standards of professional integrity throughout administration, and to increase access to new technologies for openness and accountability. The general view is that the benefits of technology outweigh the disadvantages for citizenship. The grand challenges are improving public services, increasing public integrity, managing public resources, safer communities, and corporate accountability. Not surprisingly, corporate accountability is one of the weakest.
Using the Foucauldian framework, the question is about the potential for resistance that is created by the increase in power. There are cases to discuss around hacktivism and the use of technologies. There is an issue of the ability to resist power – e.g. details being passed between companies on the basis of predictions. The issue is not only about who uses the data and how they control it. Sometimes resisting it requires the approaches that illegal actors use to hide their tracks.
A challenge for the workshop is that the area is so wide that we need to focus on specific aspects – e.g. the use of systems in government while the technology keeps changing, or interoperability. There are overlaps between environmental democracy and open data, with many similar actors – and with much more buy-in from governments and officials. There has also been technological change that makes it easier for governments (e.g. Mexico releasing environmental data under the OGP).
Sovereignty is also an issue – with a loss of it to technology and corporations over recent years, and indeed corporate accountability is noted in the OGP framework as an area that needs more attention.
There is also an issue about information that is not allowed to exist; absences and silences are important. There are issues of consent – network effects prevent real options for consent, and therefore society and academics can force businesses to behave socially in a specific way. Keeping information and attributing it to individuals is the crux of the matter and where governance should come in. You have to communicate over the internet about who you are, but that doesn't mean we can't dictate to corporations what they are allowed to do and how they use it. We can also consider privacy by design.
Session 2: Algorithmic Governance and the State
Dr Brendan Flynn (NUI Galway), 'When Big Data Meets Artificial Intelligence will Governance by Algorithm be More or Less Likely to Go to War?'. When looking at autonomous weapons we can learn about algorithmic governance in general. Algorithmic decision support systems have a role to play in a very narrow scope – doing what they do in the stock market: identifying very dangerous responses quickly and stopping them. In terms of politics, many things will continue as before. One thing that comes from military systems is that there is always a 'human in the loop' – and that is sometimes the problem. There will be HCI issues with making decisions quickly based on algorithms, and things can go very wrong. There are false positive cases, as in the example of the USS Vincennes, which used a decision support system in the decision to shoot down a passenger plane. Decision taking is limited by decision shaping, which is handed more and more to algorithms. There are issues with the way military practice understands command responsibility in the Navy, which places a very high standard of responsibility for failure. There is a need to see how to interpret information from black boxes on false positives and false negatives. We can use this extreme example to learn about civic cases, and the need for high standards for officials. If we do visit some version of command responsibility on those who use algorithms in governance, it becomes possible to put responsibility on the user of the algorithm and not only on the creators of the code.
Dr Maria Murphy (Maynooth), 'Algorithmic Surveillance: True Negatives'. We all know that algorithmic interrogation of data for crime prevention is becoming commonplace, in government and in companies, and we know that decisions can be about life and death. When considering surveillance, there are many issues. Consider the probability of assuming someone to be a potential terrorist or extremist. In human rights law we can use the concept of private life, and algorithmic processing can challenge that. Article 8 of the Human Rights Convention is not absolute and can be qualified in specific cases – and the ECHR asks for justifications from governments, to show that they follow the guidelines. Surveillance regulations need to identify explicitly the types of people and crimes that are open to observation: you can't say that everyone is open to surveillance. Specific keywords can be judged, but what about AI and machine learning, where the creator can't know what will come out? There is also a need to show proportionality to prevent social harm. False positives matter in these algorithms – because terrorism is so rare, there is a real risk of a bad impact on the prevention of terrorism or crime. With the assumption that more data is better data, we are left with a problem of generalised surveillance, which is seen as highly problematic. Interestingly, the ECHR does see a lot of potential in these technologies and their use.
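Since the base-rate problem behind those false positives comes up repeatedly, here is a quick back-of-the-envelope calculation with entirely invented numbers that shows why screening for very rare events floods the system with false alarms:

```python
# A minimal sketch (hypothetical numbers) of why rare events make
# algorithmic screening produce mostly false positives (the base-rate problem).

population = 60_000_000        # people screened
prevalence = 100 / population  # assume only 100 genuine suspects in the population
sensitivity = 0.99             # the system flags 99% of genuine suspects
false_positive_rate = 0.001    # and wrongly flags 0.1% of everyone else

true_positives = population * prevalence * sensitivity
false_positives = population * (1 - prevalence) * false_positive_rate

precision = true_positives / (true_positives + false_positives)
print(f"Flagged in error: {false_positives:,.0f}")
print(f"Chance a flagged person is a genuine suspect: {precision:.2%}")
# With these numbers: ~60,000 innocent people flagged for ~99 genuine suspects,
# i.e. precision well below 0.2% even with a seemingly accurate system.
```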
Professor Dag Wiese Schartum (University of Oslo), 'Transformation of Law into Algorithm'. His focus was on how algorithms are created, thinking about this within government systems – they are the bedrock of our welfare systems, which is the way they appear in law. Algorithms are a form of decision-making: general decisions about what should be regarded, followed by individual decisions. The translation of decisions into computer code takes the legal decision-making process as its raw material and transforms it into algorithms. Programmers do have autonomy when translating requirements into code – the Norwegian experience shows close work with experts to implement the code. You can think of an ideal transformation model from a system of rules to algorithms, one that exists within a domain – a service or authority of a government – and is done for the purpose of supporting decision-making. The process moves from the qualification of legal sources, through interpretations made in natural language, to a specification of rules, and then into a formal language that is used for programming and modelling. There are iterations throughout the process, the system is tested, it goes through a process of confirming the specification, and then it gets into use. It is too complex to test every aspect of it, but once the specifications are confirmed, it is used for decision-making. In terms of research, we need to understand the transformation process in different agencies – overall organisation, model of system development, competences, and degree of law-making effects. The challenge is the need to reform the system: adapting to political and social change over time. The system needs to be flexible in its design to allow openness rather than rigidity.
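The staged transformation described above can be sketched in code. The statute text, the interpretation and the numeric thresholds below are all invented for illustration; the point is that every stage narrows the interpretation before anything becomes executable.

```python
# A hypothetical walk through the law-to-algorithm transformation model:
# legal source -> natural-language interpretation -> rule specification -> formal code -> testing.

legal_source = "A person who is resident and of limited means is entitled to the allowance."

interpretation = (
    "'Resident' means present in the country more than 183 days in the tax year; "
    "'limited means' means annual income below a threshold set by regulation."
)

rule_specification = {
    "min_days_resident": 184,    # interpretation choice made at the specification stage
    "income_threshold": 20_000,  # filled in from (invented) secondary regulation
}

def entitled(days_resident: int, annual_income: float, spec: dict = rule_specification) -> bool:
    # Formalisation stage: the natural-language rule becomes executable logic.
    return days_resident >= spec["min_days_resident"] and annual_income < spec["income_threshold"]

# Testing/confirmation stage: a handful of cases checked against the specification.
assert entitled(200, 18_000) is True
assert entitled(100, 18_000) is False
assert entitled(200, 25_000) is False
```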
Heike Felzmann (NUI Galway), 'The Imputation of Mental Health from Social Media Contributions', came from a philosophy and psychology background. Algorithms can access different sources – blogs, social media – and this personal data is being used for mood analysis, which can lead to observations about mental health. Since 2013 there have been examples of identifying affective disorders, and the research doesn't consider the ethical implications. The data being used includes content, individual metadata such as the timing of online activities, length of contributions and typing speed, as well as network characteristics and biosensing such as voice and facial expressions. Some ethical challenges include: first, contextual integrity (Nissenbaum 2004/2009) – privacy expectations are context specific, not constant rules. Second, lack of vulnerability protection – analysis of mental health breaches people's right to protect their health information. Third, potential negative consequences, with impacts on employment, insurance, etc. Finally, the irrelevance of consent – some studies included consent in the development, but what about applying it in the world? We see no informed consent, no opt-out, no content-related vulnerability protections, no duty of care or risk mitigation, no feedback, and the number of participants is unlimited. All of these are in contrast to practices in Human Subjects Research guidelines.
In terms of surveillance, we should think about self-surveillance, in which citizens provide the details of surveillance themselves. Surveillance is not only negative – modern approaches are not used only for negative reasons. There is a hoarding mentality in the military-industrial complex.
The area of command responsibility received attention, with discussion of liability and different ways in which courts are treating military versus civilian responsibility.
Panel 3: Algorithmic Governance in Practice
Professor Burkhard Schafer (Edinburgh), 'Exhibit A – Algorithms as Evidence in Legal Fact Finding'. The discussion of legal aspects can easily go back to 1066 – you can go through a whole history, and there are many links from medieval law to today. As a regulatory tool, there is the issue of the rules of proof. Legal scholars don't focus enough on the importance of evidence and how to understand it. Regulation of technology is not about the law but about the implementation on the ground, for example in the case of data protection legislation. In a recent NESTA meeting, there was a discussion about the implications of Big Data – using personal data is not the only issue. For example, a citizen science project showing low exposure to emissions could lead to the location where the citizens monitored their area being chosen as the perfect site for a polluting activity – harming the very people who collected the data. That is not, strictly speaking, a data protection case. How can a citizen object to the 'computer says no' syndrome? What are the minimum criteria to challenge such a decision? What are the procedural rules of fairness? Having a meaningful cross-examination is difficult in such cases. Courts are sometimes happy to accept and use computer models, and at other times reluctant to take them. There are issues about the burden of proof from systems (e.g. showing that an ATM was working correctly when a fraud was committed). DNA tests rely on computer modelling, but on systems that are proprietary and closed. Many algorithms are hidden for business confidentiality, and these issues are being explored. One approach is to rely on open source tools; replication is another way of ensuring the results; escrow ownership of the model by a third party is another option. Finally, there is the possibility of questioning software in natural language.
Dr Aisling de Paor (DCU), 'Algorithmic Governance and Genetic Information'. There is an issue in law, with massive applications of genetic information. There is rapid technological advancement in many settings – genetic testing, pharma and many other areas – with indications of behavioural traits, disability, and more, and there are competing rights and interests. Advances are rapid – use in health care, and the technology becoming cheaper (already below $1000). In commercial settings genetic information is used in insurance, and it is valuable for economy and efficiency in medical settings; there is also a focus on personalised medicine. A lot of the concerns are about misuse of algorithms – for example, predictive assumptions about impacts on behaviour and health. The current state of predictability is limited, especially regarding the environmental influences on the expression of genes. There are conflicting rights – efficiency and economic benefits, but challenges to human rights, e.g. the right to privacy. There is also the right to non-discrimination – making decisions on the basis of probability may be deemed discriminatory. There are wider societal and public policy concerns – the possible creation of a genetic underclass and the potential to exacerbate societal stigma about disability, disease and difference. We need to identify the gaps between law, policy and code, and to consider use, commercial interests and the potential abuses.
Anthony Behan (IBM, but in a personal capacity), 'Ad Tech, Big Data and Prediction Markets: The Value of Probability'. Advertising is a very useful use case for considering what happens in such governance processes. What happens in the 200 milliseconds of a standard advertising transaction on the internet? The real-time bidding process is becoming standardised. It starts from a click: the publisher invokes an API and provides information about the interaction and the user, based on their cookie and various IDs. The Supply Side Platform (SSP) opens an auction. On the demand side there are advertisers who want to push content to people – by age group, demographics, day, time, and objectives such as click-through rates. The Demand Side Platform (DSP) looks at the SSPs; each SSP is connected to hundreds of DSPs, with complex relationships between these systems. Bidders estimate the probability that a user will engage in the way they want and offer what it is worth to them – all in micropayments. The data management platform (DMP) is important for improving the bidding: information about users, platform and context at specific times and places helps predict how people tend to behave. The advertising economy of the internet is based on this structure. We get abstractions of intent – the more privacy was invaded to understand personality and intent, the less advertisers were interested in a specific person and the more in the probability and the aggregate. People are viewed as current identity and current intent, and it's all about mathematics – there is a huge volume of transactions, and the inventory becomes more valuable. The interactions will become more diverse with the Internet of Things. The internet has become a 'data farm' – we started with the concept that people are valuable, and moved to viewing data as valuable and asking how to extract it from people. Advertising reaches into the whole commerce element.
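To make the flow easier to follow, here is a minimal, hypothetical sketch of that 200-millisecond process: an SSP opens an auction on a click, several DSPs estimate the probability that this user will engage and bid accordingly, and the highest bidder wins (with a simplified second-price rule). All names, segments and numbers are illustrative, not taken from any real platform.

```python
# Toy real-time-bidding sketch: probability of engagement -> expected value -> bid -> auction.

from dataclasses import dataclass

@dataclass
class BidRequest:
    user_segment: str   # derived from cookies / IDs supplied by the publisher
    hour_of_day: int
    site_category: str

def estimate_click_probability(req: BidRequest, campaign_target: str) -> float:
    # Stand-in for a DMP-informed model: a better context match means a higher probability.
    base = 0.01
    if req.site_category == campaign_target:
        base += 0.02
    if req.user_segment == "frequent_buyer":
        base += 0.03
    return base

def dsp_bid(req: BidRequest, campaign_target: str, value_per_click: float) -> float:
    # Expected value of this impression to the advertiser, priced as a micropayment.
    return estimate_click_probability(req, campaign_target) * value_per_click

request = BidRequest(user_segment="frequent_buyer", hour_of_day=20, site_category="travel")
bids = {
    "dsp_airline": dsp_bid(request, "travel", value_per_click=2.00),
    "dsp_retail": dsp_bid(request, "shopping", value_per_click=1.50),
    "dsp_games": dsp_bid(request, "games", value_per_click=2.50),
}

winner = max(bids, key=bids.get)
second_price = sorted(bids.values())[-2]  # winner pays just above the runner-up's bid
print(winner, round(second_price, 4))
```

Note how the bidders never need to know who the user is – only the probability attached to the segment, which is exactly the 'abstraction of intent' point above.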
There are issues with genetics and eugenics. Eugenics fell out of favour partly because of scientific problems, and the new genetics claims much more predictive power. In neuroscience there are issues with brain scans being relied on despite insufficient scientific evidence. There is an issue with discrimination – we shouldn't assume that it is only negative; we need to think about unjustified discrimination, as there are different semantics to the word. There are also issues with institutional information infrastructure.
We found that, of the simple tools that are available to anyone and require little training, NO2 diffusion tubes are very effective. We've seen them used as a good indicator of the level of pollution, especially from traffic, as they sense pollution from diesel vehicles.
We also found that reliable equipment that can measure particulate matter known as PM2.5 (very small dust considered harmful) and other pollutants is expensive – as high as £5000 and more. Unfortunately, low-cost equipment cannot give accurate information that can be used in making a case for action.
With a community investment of £250 we will deliver 10 diffusion tubes and support the creation of a local NO2 map. There are other levels of support to the campaign – including sponsoring a specific piece of equipment.
With a growing emphasis on civil society-led change in diverse disciplines, from International Development to Town Planning, there is an increasing demand to understand how institutions might work with the public effectively and fairly.
Extreme Citizen Science is a situated, bottom-up practice that takes into account local needs, practices and culture and works with broad networks of people to design and build new devices and knowledge creation processes that can transform the world.
In this talk, I discussed the work of the UCL Extreme Citizen Science group within the wider context of developments in the field of citizen science. I covered the work that ExCiteS has already done, what it is currently developing, and its plans for the future.
On the day before the annual meeting, the afternoon was dedicated to a citizen science safari, with a visit to the Parc de la Ciutadella and the nearby coast, learning about and trying a range of citizen science projects.
Some of my notes from the meeting day are provided below.
Katrin Vohland (ECSA vice chair) opened by noting that we see growing networks at national levels (Austria, Germany) and internationally. She noted the role of ECSA as a networking organisation and drew parallels to transformative social innovation theory, which talks about 'guided expansion'. ECSA can develop into multiple hubs (innovation, urban, ecology, etc.) with shared responsibility and potentially a distributed secretariat. We can share experiences and workload across the network and find new ways to grow.
Libby Hepburn (Australian Citizen Science Association, ACSA) talked about the experience in Australia from two perspectives – personally running the Coastal Atlas of Australia and being involved in ACSA. Starting with the Australian context: the history of a country with few people (a population of 20 million over a space larger than Europe, and the displacement of Aboriginal groups and loss of local knowledge), where the impact of weather and climate is important. Only 25% of Australian species have been described. There are lots of introduced species – from rabbits to dung beetles to cane toads – though there are counter-examples: dung beetles are actually successful, as they deal with the impact of introduced hoofed species. The development of science in Australia dates from the late 19th century. The political approach towards science is complex and changing, but citizen science doesn't wait for the political environment. The Australian Museum created a project that digitised over 16,000 transcriptions of species. Projects such as Explore the Sea-floor allow people to classify images that are taken automatically under the sea. Philip Roetman's Cat Tracker project is another example, helping to understand the damage that domestic cats cause to local biodiversity. The Atlas of Living Australia allows information sharing and the display of distribution patterns, with additional layers – including likely rainfall. They are starting to develop a citizen science project finder and starting an association – while keeping links to the other emerging associations and projects. She noted the analysis in the Socientize white paper, OPAL, and other lessons from around the world.
A presentation from the Citi-Sense project explained the development of a sensor-based citizens' observatory community, and some of the products that are ready for use. Stationary boxes are starting to make it possible to produce information about air quality. They have developed the CityAir app, which allows users to report geolocated perceptions and visualises the community's reports, providing both personal and community perceptions. There are ways of integrating the data from the models and the perceptions.
Sven Schade (JRC) talked about the citizen science data flow survey, which received responses from 149 projects at different scales – from neighbourhood to multinational. On data re-usability, while 90 projects provide data, the majority do so only after an embargo.
Daniel Wyler (University of Zürich) talked about citizen science in universities – an initiative at the University of Zürich to establish citizen science at public research and education bodies. They want to establish the Zürich Citizen Science Centre and are developing two papers: a policy paper about the area, and a set of suggested standards for research universities and science funding bodies.
Josep Perelló talked about creativity and innovation in Barcelona – BCNLab is a collaboration with the city council, providing a hub that allows grass-roots groups to create activities. With an open scope, they established a citizen science office and are promoting participatory practices in scientific research, benefiting from research multipliers, shared resources, a large base of committed participants, common protocols and a data repository. He drew inspiration from Michel Callon's (2003) 'research in the wild' concept.
Daniel Garcia talked about Responsible Research and Innovation (RRI) challenges and the linkage to citizen science. RRI includes concepts such as CBPR, Science Shops, and Open Science. Citizen science is concerned with gaining political acceptance to inform policies. There are multiple links between RRI and citizen science.
Anne Bowser and Elisabeth Tyson described the Wilson Center Commons Lab and the emerging legal landscape in the US: the crowdsourcing and citizen science bill of 2015 being offered in Congress, which is about educating policy makers on the topic. There was also a memo from the Office of Science and Technology Policy, which asked agencies to appoint points of contact for citizen science, and to standardise metadata and catalogue citizen science activities. A toolkit was published to assist with the implementation. There is an effort to create a shared project database across the CSA, CitSci.org, SciStarter and other sources. There is value in these databases for end users, and they can also be used as research tools.
From the ECSA meeting itself there are several pieces of news. ECSA has 84 members from 22 countries – 30% individual members, the rest organisational members. There is a new badge that recognises ECSA members. The working group on principles and standards published the 10 principles of citizen science. A new working group deals with best practice and building capacity. The data working group is exploring interoperability, privacy/reliability, and intellectual property rights. The international conference is now being planned for 19-21 May 2016, and there is an emerging social media presence on Instagram and Facebook. The policy group is engaging at EU policy levels, but also following international developments in the area of citizen science and policy; it is planning a policy briefing, responding to policy consultations, and there are some proposals for areas where ECSA can influence policy. A new working group was suggested to coordinate the work of citizen science facilitators. New members were selected to the advisory board: Malene Bruun (European Environment Agency), Alan Irwin (Department of Organization at Copenhagen Business School), Michael Søgaard Jørgensen (DIST, Aalborg University), Roger Owen (Scottish Environment Protection Agency) and Ferdinando Boero (University of Salento).
What follows are my personal reflections from the summit and the themes that I feel are emerging in the area of environmental information today.
When considering the recent adoption of the Sustainable Development Goals (SDGs) by the UN General Assembly, it is not surprising that they loomed large over the summit – as drivers of demand for environmental information over the next 15 years, as focal points for the effort to coordinate information collection and dissemination, but also as an opportunity to make new links between environment and health, or to promote environmental democracy (access to information, participation in decision-making, and access to justice). The SDGs seem to be very much at the front of the minds of the international organisations that are part of the Eye on Earth alliance, although other organisations, companies and researchers who come with a more technical focus (e.g. Big Data or remote sensing) are less aware of them – at least in terms of referring to them in their presentations during the summit.
Beyond the SDGs, two overarching tensions emerged throughout the presentations and discussions – and both are challenging. They are the tensions between abundance and scarcity, and between emotions and rationality. Let’s look at them in turn.
Abundance and scarcity came up again and again. On the data side, the themes of the 'data revolution', more satellite information, crowdsourcing from many thousands of weather observers, and the creation of more sources of information (e.g. the Environmental Democracy Index) are all examples of abundance in the amount of available data and information. At the same time, this was contrasted with scarcity in the real world (e.g. species extinction, the health of mangroves), scarcity of actionable knowledge, and scarcity of ecologists with computing skills. Some speakers oscillated between these two ends within a few slides, or even in the same one. There wasn't an easy resolution for this tension, and both ends were presented as challenges.
With emotions and scientific rationality, the story was different. Here the conference was packed with examples that we're (finally!) moving away from a simplistic 'information deficit model' that emphasises scientific rationality as the main way to lead a change in policy or in public understanding of environmental change. Throughout the summit, presenters emphasised the role of mass media communication, art (including a live painting developed through the summit by the GRID-Arendal team), music, visualisation, and storytelling as vital ingredients that make information and knowledge relevant and actionable. Instead of a 'Two Cultures' position, Eye on Earth offered a much more harmonious and collaborative linkage between these two ways of thinking and feeling.
Next, and linked to the issue of abundance and scarcity, are costs and funding. Many talks demonstrated the value of open data and the need to provide open, free and accessible information if we want to see environmental information used effectively. Moreover, providing the information together with the ability to analyse or visualise it over the web was offered as a way to make it more powerful. However, the systems are costly, and although the IUCN's assessment demonstrated that the investment in environmental datasets is modest compared to other sources (and the same is true for citizen science), there are no sustainable, consistent and appropriate funding mechanisms yet. Funding infrastructure or networking activities is also challenging, as funders accept their value but are not willing to fund them in a sustainable way. More generally, there is an issue about the need to fund ecological and environmental studies – it seems that while 'established science' is busy with 'Big Science' – satellites, Big Data, complex computer modelling – the work of studying ecosystems in a holistic way is left to small groups of dedicated researchers and to volunteers. The urgency and speed of environmental change demand better funding for these areas and activities.
This leads us to the issue of citizen science, for which the good news is that it was mentioned throughout the summit, gaining more prominence than four years ago in the first summit (where it also received attention). In all the plenary sessions, citizen science or crowdsourced geographic information was mentioned at least once, and frequently by several speakers. Examples include the Hermes project for recording ocean temperatures, Airscapes Singapore for urban air quality monitoring, Weather Underground for sharing weather information, the Humanitarian OpenStreetMap Team's work in Malawi, Kathmandu Living Labs' response to the earthquake in Nepal, the Arab Youth Climate Movement in Bahrain's use of iNaturalist to record ecological observations, Jacky Judas's work with volunteers to monitor dragonflies in Wadi Wurayah National Park – and many more. The summit outcomes document is also clear: "The Summit highlighted the role of citizen science groups in supporting governments to fill data gaps, particularly across the environmental and social dimensions of sustainable development. Citizen Science was a major focus area within the Summit agenda and there was general consensus that reporting against SDGs must include citizen science data. To this end, a global coalition of citizen science groups will be established by the relevant actors and the Eye on Earth Alliance will continue to engage citizen science groups so that new data can be generated in areas where gaps are evident. The importance of citizen engagement in decision-making processes was also highlighted."
However, there was ambivalence about it – should it be seen as an instrument, a tool to produce environmental information, or as a means to achieve wider awareness and engagement by informed citizens? How best to achieve the multiple goals of citizen science: raising awareness, educating, providing skills well beyond the specific topic of the project, and democratising decision-making and participation? It still seems to be the case that integrating citizen science into day-to-day operations is challenging for many of the international organisations involved in the Eye on Earth alliance.
Another area of challenging interactions emerged from the need for wide partnerships between governments, international organisations, Non-Governmental Organisations (NGOs), companies, start-ups, and even the ad-hoc crowds, afforded by digital and social networks, that respond to a specific event or issue. There are very different speeds of implementation and delivery between these bodies, and in some cases there are chasms that need to be explored – for example, an undercurrent from some technology start-ups is that governments are irrelevant, and in some forms of thinking it is OK 'to move fast and break things' – including existing social contracts and practices. It was somewhat surprising to hear speakers praising Uber or AirBnB, especially when they came from people who are familiar with the need for careful negotiations that take into account wider goals and objectives. I can see the wish to move things faster – but what risks do we bring by breaking things?
With the discussions about Rio Principle 10 and the new developments in Latin America, the Environmental Democracy Index, and the rest, I became more convinced, as I noted in 2011, that we need to start thinking about adding another right to the three that are included in it (access to environmental information, participation in decision-making, and access to justice), and develop a right to produce environmental information that will be taken seriously by the authorities – in other words, a right to citizen science. I was somewhat surprised by the responses when I raised this point during the discussion on Principle 10.
Final panel (source: IISD)
Finally, Eye on Earth was inclusive and collaborative, and it was a pleasure to see how open people were to discussing issues and exploring new connections, points of view and ways of thinking. A special point that drew several positive responses was the gender representation in such a high-level international conference with a fairly technical focus (see the image of the closing panel). The composition of the speakers at the summit, and the fact that it was possible to have such a level of women's representation, was fantastic to experience (and made one of the male-only panels on the last day seem odd!). It is also an important lesson for many academic conferences – if Eye on Earth can do it, I cannot see a reason why it is not possible elsewhere.
The afternoon of the last day of Eye on Earth included two plenary sessions and a discussion (for the morning, see this post). The first plenary focused on remote sensing and location-enabled applications:
Taner Kodanaz (DigitalGlobe): technology that once looked out to the sky now allows us to look at the Earth from 400 miles. DigitalGlobe started 14 years ago with high-resolution satellite imagery – with billions of users a day relying on online maps. In natural disasters, they provide information that helps in responding to them. Some examples of accelerating efforts include forest fires and intentional fires – in Global Forest Watch, DigitalGlobe data is used to monitor fire and deforestation and address them, building on the work that WRI led in Indonesia to deal with forest fires. He also showed Missing Maps and the response to the Kathmandu earthquake, among other cases.
Anil Kumar (Environment Agency – Abu Dhabi): Abu Dhabi has carried out conservation efforts for a long time, with a special interest in the Houbara, falcons, the Scimitar-horned Oryx and several other species. Abu Dhabi was already doing wildlife tracking 20 years ago, using satellite tracking to gain insights into migratory routes and stopovers and to reach agreements about avoiding hunting during migration, and they have studied different patterns of use. They have also done habitat mapping using satellite information, with field verification to check that the classification works. The local ability to create classifications of different habitats made it possible to share them, digitally and on paper. This allows protecting areas, following national and international obligations, improving governance, and even supports emergency response and accurate blue carbon information. They also map local forestation, and they have an environmental portal where they share the information.
Lian Pin Koh (Conservation Drones): the idea was to be able to monitor the nests of orangutans, which are difficult to monitor from the ground. Because commercial drones are expensive, he was involved in creating a DIY drone in 2012, based on a toy plane with a programmable route and a simple camera. This attracted attention from conservation groups and community scientists. Conservation Drones started as a project and has worked in many places; they have managed to use it for a wide range of projects and shared their experience. The drone is cheap – $700 – and allows repeat monitoring, as well as identifying illegal logging, reaching 1-2 cm resolution. It was also used in disaster relief, in a case of a flood from a burst dam that happened during forest monitoring. Attitudes to ConservationDrones.org changed rapidly, from ridicule to excitement, and now they are involved in exploring how to map and quantify biomass – fuel load and controlled burns. The issue with drones is to create actionable information.
Justin Saunders (eMapsite): Malawi experienced incredible rainfall, with 200,000 people displaced. Rapid response doesn't happen until an event reaches the news – and this one didn't receive much attention. They used the UN Charter to gain access to radar imagery that helped in responding to the places that were flooded. They could see the inundation, and also used a flood model to see how realistic it was. The event exceeded all the assumptions – including the one-in-500-years flood. In Malawi there isn't information about buildings and community assets, so they worked with OpenStreetMap, carrying out community mapping following the practices of Open Cities, and this supported many relief organisations. They also used the Masdap.mw system – the Malawi Spatial Data Portal (based on open source) – which allows information sharing; having a single platform helps to ensure sharing. They used crowdsourcing before, during and after the event, aware that with climate change floods will exceed historical records. The use of open source software encourages people to train, and it improved the flood modelling. Institutions take up new technology, data and methodology rapidly – especially when it is free and does not require investment. Visualisation helped action.
Steven Ramage (what3words): there are 135 countries that don't have addressing information, so for the Universal Postal Union this is very valuable. There are four billion people without a location reference. what3words allows creating a digital location reference of three words in places that are informal and don't have an addressing system. There are 860,000 people in informal settlements – how do we communicate their location? Instead of lat/long, when you need to communicate between people, three words become the key to the place. The system is small – 10MB – and can work without connectivity, and there is research demonstrating that words are easier to remember than numbers. Longer words are assigned to less populated areas, and there is a new dictionary for each language, enabling integration with indigenous languages. It has started to be used by esri, Nestoria, the UN, Safe Software, Mapillary and GoCarShare. It was used in the Nepal earthquake and in the delivery of medicine in informal settlements, and UNOCHA suggests using what3words.
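As a toy sketch of the general idea only (emphatically not the actual what3words algorithm or word list): divide the world into small grid cells and map each cell index onto three words from a fixed dictionary, so a location can be spoken as words rather than coordinates.

```python
# Toy illustration of a words-for-coordinates scheme. The word list, cell size and
# mapping are invented; a real system needs a vocabulary large enough that
# len(WORDS) ** 3 covers every cell, so each cell gets a unique word triple.

WORDS = ["apple", "river", "stone", "cloud", "tiger", "lamp", "forest", "coral"]
CELL_DEG = 0.0001  # roughly 10 m cells in latitude; real systems use ~3 m squares

def cell_index(lat: float, lon: float) -> int:
    # Number the grid cells row by row across the whole globe.
    row = int((lat + 90) / CELL_DEG)
    col = int((lon + 180) / CELL_DEG)
    cols_per_row = int(360 / CELL_DEG)
    return row * cols_per_row + col

def index_to_words(idx: int):
    # Express the cell index in base len(WORDS), using words as the digits.
    n = len(WORDS)
    return (WORDS[idx % n], WORDS[(idx // n) % n], WORDS[(idx // (n * n)) % n])

# Three words identifying the ~10 m cell around a given point:
print(index_to_words(cell_index(51.5246, -0.1340)))
```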
The final set of talks, titled 'Feet in the Field', was chaired by Stuart Paterson (Conservation Leadership Programme) and explored volunteering programmes. He noted that the questions for the session were: How do we build capacity to collect primary data? How do we turn people into future conservation leaders? How do we communicate with policy makers? Feet in the Field aims to support future conservation leaders, with a six-stage process of identifying and promoting young leaders. There is a need for investment and attention to maintain diversity.
David Kuria (KENVO – Kijabe Environment Volunteers): the group explores conservation and livelihoods, and was founded in 1994 in the Kikuyu Escarpment Forest. They do education but also community empowerment. They observed forest degradation – illegal logging, overgrazing and also a breakdown of social systems. Knowledge and skills are gained locally and through NGOs, and then used to mobilise the community and to lobby, but also for patrolling and monitoring. They have done different studies – poaching, bird surveys, forest monitoring, as well as climate change and carbon trading. The data is used for action – e.g. encouraging ecotourism, or capacity building for many farmers. Data is important for decision makers and is a strong tool for conservation awareness – it fosters support. But more important is the human side – good leadership, motivation and engagement, respecting existing systems, ownership by stakeholders, and working with marginalised groups. There are many challenges: technical capacity, resources, high turnover of government staff, limited ability to volunteer, the vast area, and more.
Alberto Campos (Aquasis, Brazil): 21 years of preventing extinctions in Brazil – based in Fortaleza, they look after highly endangered marine mammals and birds. They have an emergency plan and an action plan – and to deliver these they need a long-term plan. The problem is that they need long-term funding and conservation and fieldwork training – and they have been receiving support from the CLP. Systems they developed have been adopted by the government. Communicating these results means shifting the focus from the conservation of species to the resources the species help to conserve. Biodiversity conservation opens up other resources – the Manakin is becoming an indicator of clean and accessible water – and that helps to win recognition for them.
Ayesha Yousef Al Blooshi (Marine Biodiversity at EAD): as primary producers of environmental data, EAD produce data and then pass it to the environmental management sector, from where it is used by government and shared with the world. They have been monitoring corals undersea, taking photo transects that are analysed – a very manual process that takes a lot of time – and they are thinking about using CoralNet, which uses machine learning to recognise species. The seagrass supports the population of dugongs, which they monitor from the air; they also track them, and use drone technology to monitor dolphins. They have a collector app that allows them to record different sightings, which speeds up and simplifies data collection. They also gather traditional knowledge from fishermen – looking at the past to capture a wealth of data.
Nicolas Heard talked about funding from the Mohamed bin Zayed Species Conservation Fund. They like people who are passionate about species and who can show how a small grant can be used to further the cause of their species. The passion needs to be matched with science – it is also important to pass on enthusiasm to local communities, but that is not enough: you need data, information, knowledge, skills and collaboration. They provide small grants for surveys and monitoring and encourage the contribution of data for other purposes. They help support outreach, prioritise conservation action and improve efficiency.
Jacky Judas (Wadi Wurayah National Park), on the eastern coast of the UAE: the park was created in 2009 and made a Ramsar site in 2010, with the aim of developing a management plan. The water research programme covers education, awareness and scientific data. The participants learn about freshwater ecosystems and their challenges, and also learn how to monitor the ecosystem. There are 10-15 volunteers through EarthWatch; research activities include toad monitoring – field data collection, lab experiments, data input – and also monitoring dragonflies (the area is a hotspot for them), which led to the discovery of a species never before spotted in the UAE. Working with volunteers allows monitoring over the seasons; they use iNaturalist and contribute to GBIF.
Jean-Christophe Vié (IUCN): IUCN has a tradition of primary data collection. Behind each assessment of the 70,000 species on the Red List, there is at least one person working on the ground. They created the habitat conservation programme, which allows them to support primary data collection. Species are a good way to tell stories. Projects such as Save Our Species help in understanding the distribution of species and then identifying key areas where conservation support should be provided. They ask for some monitoring information to understand the impact of the investment.
Summary of the session: we need capacity for research; data must lead to action; we should show how species help to protect other resources; combine traditional and scientific knowledge; and realise that small amounts of funding can go a long way with volunteers.
Once that part was completed, we moved to the summary of the summit.
The final panel included H.E. Razan Khalifa Al Mubarak, Jacqueline McGlade, Barbara Ryan, Janet Ranganathan and Thomas Brooks. Nima Abu-Wardeh, who moderated the whole summit, set questions to the panel. How do you all fit together? Razan: we find ways to fit together – the regions are represented, there are many positive things happening in the Arab region, and we share them. Barbara: no one organisation can deal with environmental problems alone; the power comes from public, private and civil society working together – and there are challenges in changing our internal systems to bridge the transition from data to wisdom; we need to do that. Thomas: IUCN fits into EoE through the power of its network of public bodies, 1000 civil society members, and more than 10,000 experts. Janet: WRI is trying to scale things through counting and presenting it in an engaging way. Jacquie: what is important is representing the UN family and making the poor and vulnerable heard; to address environmental problems we need the Eye on Earth alliance – this is the way to reach out across the world. What are the tools and mechanisms that people need – how is the 'how am I going to do it?' going to happen? Jacquie: UNEP Live provides web intelligence information, so we can see how clusters of knowledge are being built up; things are linked to other places across the world, letting citizens influence the agenda. Razan: we need to synchronise elevators – one with the policy makers who need the data and another with the scientists who are producing the data – with that synchronisation changing in each region according to need. People can completely bypass the system in many ways, but what happens if policy makers take too much time and the needs are urgent – what will happen after the event? Jacquie: we suggested activities that deal with foundational areas – a global network of networks, environmental education, access for all – and then link to thematic areas: biodiversity, disaster management, community sustainability and resilience, oceans and blue carbon, and water security. Barbara: the organisations we are involved in need to think about how our existing activities fit with the identified themes. Thomas: IUCN can contribute its knowledge products to the range of Eye on Earth products and advocate for mechanisms to develop the capacity to generate data. Janet: contributing data platforms – Resource Watch, Forest Watch – and working on access for all; the Environmental Democracy Index came out of EoE.
How do we do things better? There is much ground to cover in stimulating change. Barbara: for partnerships to work, they have to align with our own vision, and this partnership lets us do that; advocacy for broad open data policies – we need to get on with it. Jacquie: we need to bring Principle 10 to the UN; we need to open up governmental debate, grab participation by the neck and make it central to what we do; we have a big environmental assembly; we need data that informs. Barbara: the capabilities of citizen science and citizen sensing were front and centre, and that is crucial.
We need to talk with the media and about behaviour change, broadening our horizons.
Razan: we converge and collaborate. We came from all regions of the world and all walks of life; some are affiliated with government, research, start-ups, companies, ecologists and environmentalists. Many here were also here in 2011 – thank you for signalling the value you see in the Eye on Earth network. We are developing a strong sense of community, aiming to solve major problems of the planet, and we see a sense of purpose in assisting the monitoring of, and progress towards, the SDGs. Five organisations have committed to be founding members of the alliance: AGEDI, IUCN, WRI, UNEP and GEO. They commit to developing, assisting and guiding the global community to achieve the SDGs. Eye on Earth can provide a collective voice – it is an informal alliance – and they agreed to convene Eye on Earth again.