Leveraging the power of place in citizen science for effective conservation decision making – new paper

During the Citizen Science conference in 2015, a group of us, with the enthusiastic encouragement of John Gallo, started talking about a paper that would discuss the power of place in citizen science. John provides a very detailed account of how discussion and inspiration during the conference led to the development of the paper. Greg Newman took the lead on the writing process, and the core analysis was based on classifying and analysing 134 citizen science projects.

My contribution to the paper is mostly in the exploration of the concept of place, including the interpretation within Human Geography of places as spaces of flows (so the paper cites Doreen Massey). I was also involved in various discussions about the development of the dimensions of place that were included in the analysis, while most of the work was done by Greg Newman, Bridie McGreavy & Marc Chandler.

The paper is now out and free to read and reuse.

Place-based citizen science framework (a) before and (b) after leveraging the power of place. Note that after leveraging the power of place, the citizen science circle is enlarged to reflect a potential increase in participation, data collection, and quality of conservation decision making and that the overall influence of decision making also grew. Note also that the relative size of Zone One increased while the inherent capacity of the power of place remained the same size.

While it was, for me, expected that place would have an important role in citizen science, it is excellent to see that the analysis supported this observation through consistent classification of citizen science projects across three collections. The model above suggests how it can be used.

The paper development process, however, demonstrates the power of cyberspace: the team met regularly online and shared documents, details and drafts along the way, with regular online meetings that helped it all come together. The paper started with all of us in the same place at the same time, and this interaction was enough to sustain our teamwork all the way to publication.

The paper is open access and the abstract for it is:

Many citizen science projects are place-based – built on in-person participation and motivated by local conservation. When done thoughtfully, this approach to citizen science can transform humans and their environment. Despite such possibilities, many projects struggle to meet decision-maker needs, generate useful data to inform decisions, and improve social-ecological resilience. Here, we define leveraging the ‘power of place’ in citizen science, and posit that doing this improves conservation decision making, increases participation, and improves community resilience. First, we explore ‘place’ and identify five place dimensions: social-ecological, narrative and name-based, knowledge-based, emotional and affective, and performative. We then thematically analyze 134 case studies drawn from CitSci.org (n = 39), The Stewardship Network New England (TSN-NE; n = 39), and Earthwatch (n = 56) regarding: (1) use of place dimensions in materials (as one indication of leveraging the power of place), (2) intent for use of data in decision-making, and (3) evidence of such use. We find that 89% of projects intend for data to be used, 46% demonstrate no evidence of use, and 54% provide some evidence of use. Moreover, projects used in decision making leverage more (t = − 4.8, df = 117; p < 0.001) place dimensions (x̄ = 3.0; s = 1.4) than those not used in decision making (x̄ = 1.8; s = 1.2). Further, a Principal Components Analysis identifies three related components (aesthetic, narrative and name-based, and social-ecological). Given these findings, we present a framework for leveraging place in citizen science projects and platforms, and recommend approaches to better impart intended outcomes.
We discuss place in citizen science related to relevance, participation, resilience, and scalability and conclude that effective decision making as a means towards more resilient and sustainable communities can be strengthened by leveraging the power of place in citizen science.
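As a side note on the statistics quoted above: the comparison is a standard two-sample t-test on the number of place dimensions each project leverages. The sketch below uses invented scores, chosen only so the group means match the abstract's 3.0 and 1.8 (it is not the paper's data), to show how such a comparison is computed:

```python
# Illustrative Welch two-sample t-test on hypothetical place-dimension counts.
# The scores are made up so that group means match the abstract (3.0 vs 1.8);
# they are NOT the study's data.
import statistics as st

used = [1, 2, 2, 3, 3, 3, 3, 4, 4, 5]      # projects whose data informed decisions
not_used = [0, 1, 1, 2, 2, 2, 2, 3, 3, 2]  # projects with no evidence of use

def welch_t(a, b):
    """Welch's t statistic and approximate (Welch-Satterthwaite) df."""
    ma, mb = st.mean(a), st.mean(b)
    va, vb = st.variance(a), st.variance(b)  # sample variances
    se2_a, se2_b = va / len(a), vb / len(b)
    t = (ma - mb) / (se2_a + se2_b) ** 0.5
    df = (se2_a + se2_b) ** 2 / (
        se2_a ** 2 / (len(a) - 1) + se2_b ** 2 / (len(b) - 1)
    )
    return t, df

t, df = welch_t(used, not_used)
print(f"mean(used) = {st.mean(used):.1f}, mean(not used) = {st.mean(not_used):.1f}")
print(f"t = {t:.2f}, df = {df:.1f}")
```

With the real samples the same calculation gives the much larger |t| = 4.8 reported in the abstract; the sign of t simply depends on which group is subtracted from which.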

Reading ‘Citizen Scientist: Searching for Heroes and Hope in an Age of Extinction’ in place

At the beginning of the year, I received an email from Mary Ellen Hannibal, asking for a clarification of the ‘extreme citizen science’ concept. Later on, Mary Ellen provided me with an early copy of ‘Citizen Scientist: Searching for Heroes and Hope in an Age of Extinction‘, and asked if I would be willing to recommend it. I read the first part of the book before travelling to Sci Foo Camp, and was happy to provide a statement (I wouldn’t overstate the value of my endorsement, given that she received ones from Bill McKibben and Paul Ehrlich).

The part that I read captured my interest, and I finished reading the book on the way to Sci Foo and shortly after it. I enjoyed reading it, and at many points I stopped to think and absorb the rich information that Mary Ellen provided within it. At the beginning, I was expecting an account of the personal experience of doing citizen science and understanding its place in the world – much like Sharman Apt Russell’s ‘Diary of a Citizen Scientist’ (a wonderful book which I highly recommend!). However, ‘Citizen Scientist’ is a very different type of book, with a much richer internal ‘ecology’. The book weaves together five themes: the impact of the mass extinction that we are experiencing around us; a very personal account of losing a parent; the history and development of ecological knowledge of coastal California; Joseph Campbell’s literary framework of the ‘hero’s journey’, and the way it can be linked to John Steinbeck and Ed Ricketts’ work around Monterey; and the current practice of citizen science, especially around the Bay Area and coastal California. These themes are complex on their own, and Mary Ellen does a great job of exploring each one of them and bringing them into interaction with one another. As I went through the book, each of these was explained clearly from a well-researched position, with the experiential aspects of citizen science – including the frustration and challenges – beautifully expressed. As you read through the book, you start to see how these themes come together. It must be said that most of these themes are worrying or evoke a sense of loss. Against this background, citizen science plays the role of ‘hope’ at the bottom of Pandora’s box – offering a way to connect to nature, nurture it and redevelop a sense of stewardship; a way to preserve the cultural practices of the Amah Mutsun tribe, nature, and a sense of connection to place.

I felt very lucky that Mary Ellen got in touch and shared the book with me – it was just the right book for me to read at the time. After the Sci Foo Camp, I stayed in central California for 4 weeks, touring from Mountain View in the Bay Area, to Ripon in the Central Valley, to Oak View in the Ojai Valley, near Ventura and Los Angeles. Reading the book while travelling through places that are linked to it gave the visits deeper and richer context and meaning. Many of the encounters throughout the journey were linked to the topics that I mentioned above – you don’t need to be any kind of hero to experience these! Some of these encounters include the following.
First was the fascinating session at Sci Foo Camp in which Tony Barnosky discussed the issue of global tipping points (which are discussed in the book) and their wider implications – followed, a few days later, by travelling towards Yosemite, experiencing the change in very large landscapes following fires, and thinking ‘is this a local ecological tipping point, and the forest won’t come back?’. Then there was a visit to San Francisco’s Golden Gate Park, passing by the California Academy of Sciences (Cal Academy, the San Francisco natural history museum), whose story is covered in the book. Another reminder of extinction came while travelling down the famous California State Route 1, which was eerily quiet and empty of other cars on a weekend day because of the Soberanes Fire that was devastating the forest nearby (and had not stopped). Or stopping by the Mission in Santa Barbara and thinking about the human and natural history of the coast, or just looking at the kelp on the beach and appreciating it much more…

I’ll try to write more about citizen science and its hopeful aspects later, but as for the book – even if you don’t travel through coastal California, I am happy with what I’ve said about it: ‘an informative, personal, emotional and fascinating account of a personal journey to ecological citizen science. It shows how our understanding of our environment and the need for urgent action to address the mass extinction that is happening in front of our eyes can be addressed through participatory science activities’.

Science Foo Camp 2016

Science Foo Camp (SciFoo) is an invitation-based science unconference that is organised by O’Reilly Media, Google, Nature, and Digital Science. Put another way, it is a weekend event (from Friday evening to Sunday afternoon) where 250 scientists, science communicators and journalists, technology people from areas that relate to science, artists and ‘none of the above’ come and talk about their interests, other people’s interests, and new ideas, in a semi-structured way.

As this is an invitation-only event, when I got the invitation I wasn’t sure if it was real – only to replace this feeling with excitement after checking some of the information about it (on Wikipedia and other sites). I was also a little bit concerned after noticing how many of the participants were from traditional natural science disciplines, such as physics, computer science, neuroscience, chemistry, engineering and such (‘impostor syndrome‘). However, the journey into citizen science since 2010 and the first Citizen Cyberscience Summit has led me to fascinating encounters at ecological conferences and with physicists, environmental scientists, synthetic biologists, epidemiologists, and experimental physicists, in addition to links to Human-Computer Interaction researchers, educational experts, environmental policy makers, and many more. So I hoped that I could also communicate with the scientists who come to SciFoo.

I was especially looking forward to seeing how the unconference is organised and run. I’ve experienced unconferences (e.g. WhereCampEU in 2010, parts of State of the Map) and organised the Citizen Cyberscience Summits in 2012 & 2014, where we mashed up a formal academic conference with an unconference. I was intrigued to see how it works when the O’Reilly Media team runs it, as they popularised the approach.

The event itself ran from the evening of Friday to early afternoon on Sunday, with a very active 45 hours in between.

The opening of the event included the following information (from Sarah Winge, Cat Allman, Chris DiBona, Daniel Hook, and Tim O’Reilly): the Foo Camp is an opportunity for a bunch of really interesting people to get together and tell each other interesting stories – talk about the most interesting story that you’ve got. The main outputs are new connections between people. It is an opportunity to recharge and to get new ideas – helping each person to recharge using someone else’s battery. The ground rules include: go to sessions outside your field of expertise – an opportunity to see the world from a different perspective; be as extroverted as you can possibly be – don’t sit with people that you know, as you’ll have a better weekend talking to different people. The aim is to make a conference that is made mostly of breaks – it’s totally OK to spend time not in a session; the law of two feet – it’s OK to come and go from sessions. It’s a DIY event. There are interesting discussions between competitors, commercially or academically – so it is OK to say that parts of the conversations will be kept confidential.

The expected scramble to suggest sessions and fill the board led to a very rich programme with huge variety – 110 sessions for a day and a half, ranging from ‘Origami Innovations’ and ‘Are there Global Tipping Points?’, to ‘Growth Hacking, Rare disease R&D’, and ‘What do we know about the universe? And what don’t we know?’. Multiple sessions explored open science (open collaborations, reproducibility, open access publication), issues with science protocols, increasing engagement in science, gender, and social justice, side by side with designer babies, geoengineering, life extension, artificial intelligence and much more.

In addition, several curated sessions of lightning talks (5-minute rapid presentations by participants) gave a flavour of the range of areas that participants cover. For example, Carrie Partch talked about understanding how circadian cycles work – including the phenomenon of social jet-lag, with people sleeping much more at weekends to compensate for lack of sleep during the weekdays. Elaine Chew demonstrated her mathematical analysis of different music performances and her work as a concert pianist.

I followed the advice from Sarah and started conversations with different people during meals, on the bus to and from SciFoo, or during coffee breaks. Everyone around was doing the same – it was just wonderful to see people all around introducing themselves and starting to talk about what they do. I found myself learning about research on common drugs that can extend the life of mice, brain research with amputees, and discussing how to move academic publications to open access (but somehow ending with the impact of the Cold War on investment in science).

I organised a session about citizen science, crowdsourcing and open science, in which the discussion included questions about science with monks in Tibet, and patients’ active involvement in research about their condition. I joined two other sessions: ‘Making Science Communication Thrilling for the Lay Person‘ with Elodie Chabrol (who runs Pint of Science) and Adam Davidson; and ‘Science Communication: What? What? How? Discuss‘ with Suze Kundu, Jen Gupta, Simon Watt & Sophie Meekings. Plenty of ideas (and even a sub-hashtag to get responses to specific questions) came from these sessions, but also a realisation of the challenges early-career academics face in developing their skills in this area, with discouraging remarks from more senior academics and potential career risks – so we also dedicated thought to appropriate mechanisms to support public engagement activity.

Another fantastic discussion was led by Kevin Esvelt about ‘Better than nature: ethics of ecological engineering‘ – which involves gene editing with techniques such as CRISPR, with potentially far-reaching impacts on ecological systems. This session demonstrated how valuable it is to have an interdisciplinary conference where the expertise of the people in the room ranges from geoengineering to ecology and ethics. It was also a mini-demonstration of Responsible Research and Innovation (RRI) in action, where potential directions of scientific research are discussed with a range of people with different backgrounds and knowledge.

The amount of input, encounters and discussion at SciFoo is overwhelming, and the social activities after the sessions (including singing and sitting by the ‘fire’) are part of the fun – though these were a very exhausting 45 hours.

Because SciFoo invitees include a whole group of people from science communication, and as SciFoo coincided with Caren Cooper’s stint running the @IamSciComm Twitter account, where she discussed the overlap between citizen science and science communication, I paid attention to this overlap during the meeting. The good news is that many of the scientists had some idea of what citizen science is. I always check that people know the term before explaining my work, so it’s great to see that the term is gaining traction. The less good news is that it is still categorised under ‘science communication’, and maybe a useful session would have been ‘What is the problem of scientists with citizen science?’.


For me, SciFoo raised the question of the value of interdisciplinary meetings and how to make them work. With such a list of organisers, the location, the exclusiveness and the mystery of invitation (several people, including me, wondered ‘It’s great being here, but how did they find out about my work?’) – all this makes it possible to assemble such an eclectic collection of researchers. While it’s obvious that the list is well curated, with considerations of research areas, expertise, background, academic career stage, and diversity, the end result and the format open up the possibility of creative and unexpected meetings (e.g. during lunch). My own experience is that achieving anything approaching such a mix of disciplines in a common ‘bottom-up’ academic conference is very challenging and needs a lot of work. The Citizen Cyberscience Summits, the ECSA conference, and the coming Citizen Science Association conference are highly interdisciplinary in terms of the traditional academic areas from which participants come – but they require convincing people to submit papers and come to the conference. Usually, the interdisciplinary event is an additional commitment on top of their disciplinary focus, and this creates a special challenge. Maybe it would be possible to achieve similar interdisciplinary meetings by getting endorsements from multiple disciplinary societies, or support from bodies with a wide remit like the Royal Society and the Royal Academy of Engineering.

Another thought is that the model of reaching out to people and convincing them that it is worth their while to come to such a meeting might also work better in allowing mixing, as open calls are affected by ‘self-deselection’, where people decide that the conference is not for them (e.g. getting active participants to a citizen science conference, or ensuring that papers come from all flavours of citizen science).

Another delightful aspect was noticing how the unconference format worked with people who (mostly) hadn’t experienced it before – the number of slots and opportunities was enough for people to mostly put their sessions forward. Despite the call for people to be extroverted, less confident people prepare their ideas more slowly and can end up outside the grid. It was nice to see how some places in the grid were blocked off during the early stages and then released for ideas that came up during breaks, or for sessions that were proposed more slowly and didn’t secure a spot. There might also be value in restricting people to one session at first, and then allowing more. What are the steps required to make an unconference format inclusive at the session-setting stage?

In contrast to the approach in academic meetings of controlling the number of parallel sessions (to ensure enough people show up to each session), SciFoo has so many that most sessions end up as a small group of about 10 or 20 people. This makes them more valuable and suitable for exploratory discussions – which worked well in the sessions that I attended. In a way, at its best, SciFoo is many short brainstorming sessions that leave you wishing you could discuss for longer.

If you get an invitation (and being flattered is part of the allure of SciFoo), it is worth going on the wiki, giving a bit of a description of yourself and thinking about a session that you’d like to propose – a ‘+1’ can help you get a feeling of whether people will be interested in it. Think about a catchy title that includes keywords, and remember that you are talking to intelligent lay people from outside your discipline, so prepare to explain some core principles for the discussion in 5 minutes or so. Don’t dedicate the time to telling people only about your research – think of an issue that bothers you to some degree and that you want to explore (for me it was the connection between citizen science and open science), and consider that you’ll have one hour to discuss it.

Follow the advice – say hello to everyone, have great conversations during breaks, and don’t go to sessions if the conversation is more interesting. Another take on the meeting is provided by Bjoern Brembs on his blog, with whom I had the open access conversation (and I am still unsure how we ended up at the Cold War). Also remember to enjoy the experience, sit by the ‘fire’ and talk about things other than science!



New Paper: The Three Eras of Environmental Information: the Roles of Experts and the Public

Since the first Eye on Earth conference in 2011, I have been thinking that we are moving to a new era in the relationships between experts and the public in terms of access to environmental information and its production. I also gave a talk about this issue at the Wilson Center in 2014. The three eras can be summarised as ‘information for experts, by experts’; ‘information for experts and the public, by experts, and in experts’ language’; and ‘information for experts and the public, by experts and the public, in multiple forms’.

Finally, as part of a book that summarises the outcomes of the EveryAware project, I have written a chapter that explores the three eras of environmental information and provides a more detailed account of each of them. You can access the paper here, and it should be cited as:

Haklay, M., 2017, The Three Eras of Environmental Information: The Roles of Experts and the Public, In Loreto, V., Haklay, M., Hotho, A., Servedio, V.C.P., Stumme, G., Theunis, J., Tria, F. (eds.) Participatory Sensing, Opinions and Collective Awareness. Springer. pp. 163-179.

The book includes many other chapters, and I will put several of them online later in the year. You can find the book on the Springer site.

Algorithmic governance in environmental information (or how technophilia shapes environmental democracy)

These are the slides from my talk at the Algorithmic Governance workshop (for which there are lengthy notes in the previous post). The workshop explored the many ethical, legal and conceptual issues with the transition to Big Data and algorithm based decision-making.

My contribution to the discussion is based on previous thoughts on environmental information and public use of it. Inherently, I see the relationships between environmental decision-making, information, and information systems as something that needs to be examined through the prism of the long history that links them; this way we can make sense of current trends. These three areas have been deeply linked throughout the history of the modern environmental movement since the 1960s (hence the Apollo 8 Earth image at the beginning), and the Christmas message from the Apollo 8 team, with its reference to Genesis (see below), helped make the message stronger.

To demonstrate the way this triplet evolved, I’m using texts from official documents – Stockholm 1972 declaration, Rio 1992 Agenda 21, etc. They are fairly consistent in their belief in the power of information systems in solving environmental challenges. The core aspects of environmental technophilia are summarised in slide 10.

This leads to environmental democracy principles (slide 11) and the assumptions behind them (slide 12). While information may be open, that doesn’t mean it is useful or accessible to members of the public. This was true when raw air monitoring observations were released as open data in 1997 (before anyone knew the term), and although we have better tools (e.g. Google Earth), there are consistent challenges in making information meaningful – what do you do with an Environment Agency DSM if you don’t know what it is or how to use a GIS? How do you interpret a Global Forest Watch analysis of change in tree cover in your area if you are not used to interpreting remote sensing data (a big data analysis and algorithmic governance example)? I therefore return to the hierarchy of technical knowledge and ability to use information (in slide 20) that I covered in ‘Neogeography and the delusion of democratisation‘, and look at how the opportunities and barriers have changed over the years in slide 21.

The last slides show that despite all the technical advancement, we can still have situations such as the water contamination in Flint, Michigan, which demonstrates that some of the problems from the 1960s that were supposed to be solved – well monitored, with clear regulations and processes – have come back because of negligence and a lack of appropriate governance. This is not going to be solved with information systems, although citizen science has a role to play in dealing with such governmental failure. This whole sorry mess, and the re-emergence of air quality as a Western-world environmental problem, is a topic for another discussion…

Algorithmic Governance Workshop (NUI Galway)

Algorithmic Governance Workshop (source: Niall O’Brolchain)

The workshop ‘Algorithmic Governance’ was organised as an intensive one-day discussion and development of research needs. As the organisers, Dr John Danaher and Dr Rónán Kennedy, identified:

‘The past decade has seen an explosion in big data analytics and the use  of algorithm-based systems to assist, supplement, or replace human decision-making. This is true in private industry and in public governance. It includes, for example, the use of algorithms in healthcare policy and treatment, in identifying potential tax cheats, and in stopping terrorist plotters. Such systems are attractive in light of the increasing complexity and interconnectedness of society; the general ubiquity and efficiency of ‘smart’ technology, sometimes known as the ‘Internet of Things’; and the cutbacks to government services post-2008.
This trend towards algorithmic governance poses a number of unique challenges to effective and legitimate public-bureaucratic decision-making. Although many are already concerned about the threat to privacy, there is more at stake in the rise of algorithmic governance than this right alone. Algorithms are step-by-step computer coded instructions for taking some input (e.g. tax return/financial data), processing it, and converting it into an output (e.g. recommendation for audit). When algorithms are used to supplement or replace public decision-making, political values and policies have to be translated into computer code. The coders and designers are given a set of instructions (a project ‘spec’) to guide them in this process, but such project specs are often vague and underspecified. Programmers exercise considerable autonomy when translating these requirements into code. The difficulty is that most programmers are unaware of the values and biases that can feed into this process and fail to consider how those values and biases can manifest themselves in practice, invisibly undermining fundamental rights. This is compounded by the fact that ethics and law are not part of the training of most programmers. Indeed, many view the technology as a value-neutral tool. They consequently ignore the ethical ‘gap’ between policy and code. This workshop will bring together an interdisciplinary group of scholars and experts to address the ethical gap between policy and code.’
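To make the organisers' point concrete, here is a deliberately toy sketch of how a vague project spec ('flag unusual tax returns for audit') forces a programmer to invent values. Every name, threshold and rule below is hypothetical, not taken from any real system:

```python
# Hypothetical sketch of the 'policy into code' gap: the policy says
# "flag unusual tax returns for audit", and the programmer must invent
# what 'unusual' means. The threshold and edge-case handling below are
# made-up value judgments, not anything from a real tax system.

INCOME_DEDUCTION_RATIO = 0.4  # a value judgment hidden in a constant:
                              # why 0.4 and not 0.3? The spec doesn't say.

def flag_for_audit(income: float, deductions: float) -> bool:
    """Return True if a return looks 'unusual' under the coder's reading."""
    if income <= 0:           # an edge case the spec never mentioned
        return True
    return deductions / income > INCOME_DEDUCTION_RATIO

print(flag_for_audit(50_000, 10_000))  # ratio 0.2, within the coded threshold
print(flag_for_audit(50_000, 25_000))  # ratio 0.5, exceeds the threshold
```

The point is that the 0.4 cut-off and the treatment of zero-income returns are policy decisions, yet they live invisibly in the code rather than in the spec.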

The workshop was structured around three sessions of short presentations of about 12 minutes each, with immediate discussion, followed by a workshop to develop research ideas emerging from the sessions. This very long post contains my notes from the meeting. These are my takes, not necessarily those of the presenters. For another summary of the day, check John Danaher’s blog post.

Session 1: Perspective on Algorithmic Governance

Professor Willie Golden (NUI Galway), ‘Algorithmic governance: Old or New Problem?’, focused on an information science perspective. We need to consider the history – an R.O. Mason paper from 1971 already questioned the balance between the decision-making that should be done by humans and the part that needs to be done by the system. The issue is the level of assumptions that are being integrated into the information system. Today, the amount of data being collected, and the assumptions about what it does in the world, are growing, but we need to remain sceptical about the value of the actionable information. Algorithms need managers too: Davenport, in HBR 2013, pointed out that the questions decision makers ask before and after the processing are critical to the effective use of data analysis systems. In addition, people are very concerned about data – we are complicit in handing over a lot of data as consumers, and the Internet of Things (IoT) will reveal much more. Deborah Estrin, in CACM 2014, provided a viewpoint – ‘small data, where n = me’ – highlighting how the monitoring of personal information can provide a health baseline for you. However, this information can be handed over to health insurance companies, and the question is what control you have over it. Another aspect is Artificial Intelligence – Turing in the 1950s introduced the famous ‘Turing test’ for AI. In the past 3-4 years it has become much more visible. The difference is that AI learns, which raises the question of how you can monitor a thing that learns and changes over time as it gets better. AI doesn’t have self-awareness, as Davenport noted in ‘Just How Smart Are Smart Machines?’ (2015), though he argued that machines can be more accurate than humans in analysing images. We may need to be more proactive than we used to be.

Dr Kalpana Shankar (UCD), ‘Algorithmic Governance – and the Death of Governance?’, focused on digital curation/data sustainability and the implications for governance. We invest in data curation as a socio-technical practice, but we need to explore what it does and how effective current practices are. What are the implications if we don’t do the ‘data labour’ needed to maintain data, to avoid ‘data tumbleweed’? We are selecting data sets and preserving them for the short and long term. There is an assumption that ‘the data is there’ and that it doesn’t need special attention. The choices people make about which data sets to preserve will influence the patterns of what appears later and the directions of research. Downstream, there are all sorts of business arrangements for making data available and preserving it – these decisions shape the disciplines and discourses around it. For example, preserving census data influenced many of the social sciences and directed them towards certain types of questions. Data archives have influenced the social science disciplines – e.g. towards using large data sets and away from ethnographic methods. The governance of data institutions needs examining, along with how it influences the information that is stored and shared. What the role of curating data becomes when data are open is another question. An example of the complexity is provided by a study of a system for ‘match-making’ refugees to mentors, used by an NGO: the system dates from 2006, its job classification was last updated in 2011, the organisation that uses the system cannot afford to update it, and there are impacts on those affected by the system.

Professor John Morison (QUB), ‘Algorithmic Governmentality’. From a law perspective, there is an issue of techno-optimism. He is interested in e-participation and participation in government. There are issues of open and big data, where we are given a vision of open and accountable government and a growth in democratisation – e.g. the social media revolution, or opening government through data. We see a fantasy of abundance, and there are also new feedback loops – technological solutionism applies technical fixes to problems in politics: simplistic solutions to complex issues. For example, in research into cybersecurity there is an expectation of creating code as a scholarly output. Big Data has different creators (from Google to national security bodies), and they don’t have the same goals. There are also issues of technological authoritarianism as a tool of control. Algorithmic governance requires engaging with epistemology, ontology and governance. We need to consider the impact on democracy – the AI approach argues for democratisation through an ‘n = all’ argument. Leaving aside the ability to ingest all the data, it seems to assume that subjects are no longer viewed as individuals but as aggregates that can be manipulated and acted upon. In algorithmic governance there is a false emancipation through the promise of inclusiveness; instead, it responds to predictions created from data analysis. The analysis claims to be a scientific way to respond to social needs. Ideas of individual agency disappear; here we can use Foucault’s analysis of power to understand agency. Finally, we also see government without politics – making subjects and objects amenable to action. There is no selfhood, just group prediction. This transcends and obviates many aspects of citizenship.

Niall O’Brolchain (Insight Centre), ‘The Open Government’. There is a difference between government and governance. The eGov unit at the Insight Centre for Data Analytics in Galway acts as an Open Data Institute node and is part of the Open Government Partnership (OGP). The OGP involves 66 countries and aims to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. It started in 2011 and now involves 1,500 people, with ministerial-level involvement. The OGP has a set of principles, with eligibility criteria that involve civil society and government on equal terms – the aim is to provide information so as to increase civic participation, to require the highest standards of professional integrity throughout administration, and to increase access to new technologies for openness and accountability. It generally considers that the benefits of technology for citizenship outweigh the disadvantages. Its grand challenges are improving public services, increasing public integrity, managing public resources, safer communities, and corporate accountability. Not surprisingly, corporate accountability is one of the weakest areas.


In the discussion, using the Foucauldian framework, a question arose about the potential for resistance that is created by the increase in power. There are cases to discuss around hacktivism and the use of technologies. There is an issue with the ability to resist power – e.g. details being passed between companies on the basis of predictions. The issue is not only about who uses the data but also how they control it. Sometimes resisting it requires the approaches that illegal actors use to hide their tracks.
A challenge for the workshop is that the area is so wide that we need to focus on specific aspects – e.g. the use of systems in government while the technology keeps changing, or interoperability. There are overlaps between environmental democracy and open data, with many similar actors – and with much more buy-in from government and officials. Technological change has also made things easier for governments (e.g. Mexico releasing environmental data under the OGP).
Sovereignty is also an issue – some of it has been lost to technology and corporations over recent years, and indeed corporate accountability is noted in the OGP framework as an area that needs more attention.
There is also an issue about information that is not allowed to exist; absences and silences are important. There are issues of consent – network effects prevent meaningful options for consent, and therefore society and academics can force businesses to behave socially in a specific way. The keeping of information and its attribution to individuals is the crux of the matter, and where governance should come in. You have to communicate over the internet about who you are, but that doesn’t mean we can’t dictate to corporations what they are allowed to do with that information and how to use it. We can also consider privacy by design.

Session 2: Algorithmic Governance and the State

Dr Brendan Flynn (NUI Galway), ‘When Big Data Meets Artificial Intelligence will Governance by Algorithm be More or Less Likely to Go to War?’. By looking at autonomous weapons we can learn about algorithmic governance in general. Algorithmic decision-support systems have a role to play in a very narrow scope – to do what the stock market does: identifying very dangerous responses quickly and stopping them. In terms of politics, many things will continue as before. One thing that comes from military systems is that there is always a ‘human in the loop’ – and that is sometimes the problem. There will be HCI issues with making decisions quickly based on algorithms, and things can go very wrong. There are false-positive cases, such as the USS Vincennes, which used a decision-support system in the decision to shoot down a passenger plane. Decision taking is limited by decision shaping, which is being handed more and more to algorithms. There are issues with the way military practice understands command responsibility in the Navy, which sets a very high standard of responsibility for failure. There is a need to see how to interpret information from black boxes on false positives and false negatives. We can use this extreme example to learn about civic cases, and the need for high standards for officials. If we apply some version of command responsibility to those who use algorithms in governance, it becomes possible to put responsibility on the user of the algorithm and not only on the creators of the code.

Dr Maria Murphy (Maynooth), ‘Algorithmic Surveillance: True Negatives’. We all know that algorithmic interrogation of data for crime prevention is becoming commonplace, in companies as well as government, and we know that the resulting decisions can be about life and death. When considering surveillance, there are many issues – consider the probability of wrongly assuming someone to be a potential terrorist or extremist. In human rights law we can use the concept of private life, which algorithmic processing can challenge. Article 8 of the Human Rights Convention is not absolute and can be qualified in specific cases – the ECHR asks governments for justifications, to show that they follow the guidelines. Surveillance regulations need to explicitly identify the types of people and crimes that are open to observation; you can’t say that everyone is open to surveillance. When there are specific keywords, they can be judged, but what about AI and machine learning, where even the creator can’t know what will come out? There is also a need to show proportionality to prevent social harm. As for false positives in algorithms – because terrorism is so rare, there is a serious risk of bad impacts from attempts to prevent terrorism or crime. With the assumption that more data is better data, we are left with a problem of generalised surveillance, which is seen as highly problematic. Interestingly, the ECHR does see a lot of potential in these technologies and their use.
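The base-rate problem behind the false-positives point can be made concrete with a short calculation (a minimal sketch with made-up illustrative numbers, not figures from the talk):

```python
# Base-rate illustration: even a very accurate screening algorithm
# produces mostly false positives when the target behaviour is rare.
# All numbers below are hypothetical, chosen only for illustration.

base_rate = 1 / 100_000      # assumed prevalence of the behaviour screened for
sensitivity = 0.99           # P(flagged | actual positive)
false_positive_rate = 0.01   # P(flagged | actual negative)

# Bayes' theorem: probability that a flagged person is a true positive
p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
ppv = sensitivity * base_rate / p_flagged

print(f"Positive predictive value: {ppv:.4%}")  # well under 1%
```

Under these assumed numbers, fewer than 1 in 1,000 flagged people would actually be a true positive, which is why generalised surveillance on the ‘more data is better data’ assumption is so problematic.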

Professor Dag Wiese Schartum (University of Oslo), ‘Transformation of Law into Algorithm’. His focus was on how algorithms are created, particularly within government systems. These systems are the bedrock of our welfare systems, which is the way they appear in law. Algorithms are a form of decision-making: general decisions about what should be regulated, and then individual decisions. The translation of decisions into computer code takes the legal decision-making process as its raw material and transforms it into algorithms. Programmers do have autonomy when translating requirements into code – the Norwegian experience shows close work with legal experts to implement the code. You can think of an ideal transformation model for a system of algorithms that exists within a domain – a service or authority of government – and is created for the purpose of supporting decision-making. The process is the qualification of legal sources, and their interpretation in natural language, which then turns into a specification of rules, and then into a formal language that is used for programming and modelling. There are iterations throughout the process; the system is tested, goes through a process of confirming the specification, and then goes into use. It is too complex to test every aspect of it, but once the specifications are confirmed, it is used for decision-making. In terms of research, we need to understand the transformation process in different agencies – the overall organisation, the model of system development, competences, and the degree of law-making effects. The challenge is the need to reform the system: adapting to political and social change over time. The system needs to be designed flexibly, to allow openness rather than rigidity.
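As a toy illustration of the specification-to-formal-language step in that pipeline (entirely hypothetical – the rule, names and threshold are invented, not an example from the talk), a natural-language eligibility rule might end up as code like this:

```python
from dataclasses import dataclass

# Hypothetical welfare-benefit rule, invented for illustration:
# "An applicant is eligible if they are a resident, aged 18 or over,
#  and their annual income is below a statutory threshold."
INCOME_THRESHOLD = 20_000  # made-up figure; in reality set by regulation


@dataclass
class Applicant:
    is_resident: bool
    age: int
    annual_income: float


def eligible(a: Applicant) -> bool:
    """Formal-language version of the natural-language rule above."""
    return a.is_resident and a.age >= 18 and a.annual_income < INCOME_THRESHOLD


print(eligible(Applicant(is_resident=True, age=30, annual_income=15_000)))  # True
```

Even this tiny sketch shows where programmer autonomy creeps in: the code has to commit to precise readings (is the threshold inclusive? how is ‘income’ measured?) that the natural-language rule leaves open – which is why the iterative confirmation of the specification matters.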

Heike Felzmann (NUI Galway), ‘The Imputation of Mental Health from Social Media Contributions’, from a philosophy and psychology background. Algorithms can access different sources – blogs, social media – and this personal data is used for mood analysis, which can lead to observations about mental health. From 2013 there are examples of identifying affective disorders, and the research doesn’t consider the ethical implications. The data used includes content, individual metadata such as the timing of online activity, length of contributions, and typing speed, as well as network characteristics and biosensing such as voice and facial expressions. Some ethical challenges include: first, contextual integrity (Nissenbaum 2004/2009) – privacy expectations are context specific, not constant rules. Second, lack of vulnerability protection – analysis of mental health breaches people’s right to protect their health information. Third, potential negative consequences, with impacts on employment, insurance, etc. Finally, the irrelevance of consent – some studies included consent in the development phase, but what about applying the tools in the world? We see no informed consent, no opt-out, no content-related vulnerability protections, no duty of care or risk mitigation, no feedback, and an unlimited number of participants. All of this stands in contrast to the practices in Human Subjects Research guidelines.


In the discussion on surveillance, we should think about self-surveillance, in which citizens provide the details of the surveillance themselves. Surveillance is not only negative – modern approaches are not used only for negative reasons. There is, however, a hoarding mentality in the military-industrial complex.
The area of command responsibility received attention, with discussion of liability and the different ways in which courts treat military versus civilian responsibility.

Panel 3: Algorithmic Governance in Practice

Professor Burkhard Schafer (Edinburgh), ‘Exhibit A – Algorithms as Evidence in Legal Fact Finding’. The discussion of legal aspects can easily go back to 1066 – you can go through the whole history, and there are many links from medieval law to today. As a regulatory tool, there is the issue of the rules of proof. Legal scholars don’t focus enough on the importance of evidence and how to understand it. Regulation of technology is not about the law but about implementation on the ground, for example in the case of data protection legislation. At a recent NESTA meeting, there was a discussion about the implications of Big Data – using personal data is not the only issue. For example, a citizen science project that shows low exposure to emissions might lead to a decision that the location in which the citizens monitored their area is the perfect location for a polluting activity – so harming the very people who collected the data. This is not, strictly, a data protection case. How can a citizen object to the ‘computer says no’ syndrome? What are the minimum criteria for challenging such a decision? What are the procedural rules of fairness? Having a meaningful cross-examination is difficult in such cases. Courts are sometimes happy to accept and use computer models, and at other times reluctant to take them. There are issues about the burden of proof from systems (e.g. showing that an ATM was working correctly when a fraud was committed). DNA tests rely on computer modelling, but on systems that are proprietary and closed. Many algorithms are hidden for business confidentiality, and these issues are being explored. One approach is to rely on open source tools; replication is another way of ensuring the results; escrow ownership of the model by a third party is another option. Next, there is the possibility of questioning software in natural language.

Dr Aisling de Paor (DCU), ‘Algorithmic Governance and Genetic Information’. There is an issue in law, with massive applications of genetic information. There is rapid technological advancement in many settings – genetic testing, pharma and many other areas – with indications of behavioural traits, disability, and more, and there are competing rights and interests. There are rapid advances in the use of genetic information in health care, and the technology is becoming cheaper (already below $1,000). In commercial settings it is used in insurance, and is valuable for economy and efficiency in medical settings; there is also a focus on personalised medicine. A lot of the concerns are about the misuse of algorithms – for example, predictive assumptions about behaviour and health. The current state of predictability is limited, especially regarding environmental impacts on the expression of genes. There are conflicting rights – efficiency and economic benefits against challenges to human rights, e.g. the right to privacy, and also the right to non-discrimination, since making decisions on the basis of probability may be deemed discriminatory. There are wider societal and public policy concerns – the possible creation of a genetic underclass and the potential to exacerbate societal stigma around disability, disease and difference. We need to identify the gaps between law, policy and code, and decide about use, commercial interests and potential abuses.

Anthony Behan (IBM, but in a personal capacity), ‘Ad Tech, Big Data and Prediction Markets: The Value of Probability’. Advertising is a very useful use case for considering what happens in such governance processes – specifically, what happens in the 200 milliseconds that are the standard for serving an advert on the internet. The process of real-time bidding is becoming standardised. It starts from a click – the publisher invokes an API and provides information about the interactions of the user, based on their cookie and various IDs. The Supply Side Platform (SSP) opens an auction. On the demand side, there are advertisers that want to push content to people – defined by age group, demographics, day, time, and objectives such as click-through rates. The Demand Side Platform (DSP) looks at the SSPs, and each SSP is connected to hundreds of DSPs, with complex relationships between these systems. The DSPs estimate the probability that a user will engage in the way they want, and offer how much that is worth to them – all in micropayments. The Data Management Platform (DMP) is important for improving the bidding: information about users, platform, and context at specific times and places helps to guess how people tend to behave. The advertising economy of the internet is based on this structure. We get abstractions of intent – the more privacy was invaded to understand personality and intent, the less advertisers were interested in a specific person and the more in the probability and the aggregate. People are viewed as a current identity and a current intent, and it’s all about mathematics – there is a huge number of transactions, and the inventory becomes more valuable. The interactions will become more diverse with the Internet of Things. The internet has become a ‘data farm’ – we started with the idea that people are valuable, and moved to the view that data is valuable and the question of how to extract it from people. Advertising extends into the whole commerce element.
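The bidding flow described above can be sketched as a toy auction (a hypothetical simplification: the DSP names, numbers, and second-price rule are illustrative assumptions, not any real platform’s API or the exact mechanism described in the talk):

```python
# Toy real-time-bidding sketch: an SSP collects bids from several DSPs
# for one ad impression; each DSP bids its expected value for the user,
# i.e. P(engagement) * value-per-engagement, in micro-currency units.

def dsp_bid(click_probability: float, value_per_click: float) -> float:
    """A DSP's bid: what the impression is expected to be worth to it."""
    return click_probability * value_per_click

def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Second-price auction: highest bidder wins, pays the second price."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Hypothetical bids from three DSPs with different probability estimates
bids = {
    "dsp_a": dsp_bid(click_probability=0.02, value_per_click=500),  # ~10.0
    "dsp_b": dsp_bid(click_probability=0.01, value_per_click=800),  # ~8.0
    "dsp_c": dsp_bid(click_probability=0.05, value_per_click=100),  # ~5.0
}
winner, price = run_auction(bids)
print(f"{winner} pays {price:.2f}")  # dsp_a wins, paying the second-highest bid
```

The sketch shows why the individual matters less than the probability estimate: the whole auction turns on each DSP’s aggregate-derived guess about how ‘people like this’ tend to behave.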

I’ll blog about my talk ‘Algorithmic Governance in Environmental Information (or How Technophilia Shapes Environmental Democracy)’ later.


In the discussion, issues of genetics and eugenics came up. Eugenics fell out of favour because of problems with its science, and the new genetics claims much more predictive power. In neuroscience there are issues with brain scans being used on the basis of insufficient scientific evidence. There is an issue with discrimination – we shouldn’t assume that it is only negative; we need to think about unjustified discrimination, and there are different semantics to the word. There are also issues with institutional information infrastructures.

New Citizen Science for air quality campaign

Mapping for Change, the social enterprise that I co-founded, has been assisting community groups to run air quality studies for the past five years. During this period we have worked with 30 communities across London, carrying out studies with different tools – from collecting leaves, to examining lichens, to using diffusion tubes. We have also followed the development of low-cost sensors – for example, through participation in the AirProbe challenge of the EveryAware project, or by hosting a discussion about the early stages of the Air Quality Egg.

We have found that, of the simple tools that are available to anyone and require little training, NO2 diffusion tubes are very effective. We’ve seen them used as a good indicator of the level of pollution, especially from traffic, as they sense pollution from diesel vehicles.

We also found that reliable equipment that can measure particulate matter known as PM2.5 (very small dust particles considered harmful) and other pollutants is expensive – as much as £5,000 or more. Unfortunately, low-cost equipment cannot provide information accurate enough to be used in making a case for action.

Now, after developing the methodology for working with different groups and supporting local efforts, we are launching a crowdfunding campaign to support a large-scale data collection campaign using diffusion tubes. The aim is to go beyond that and create an equipment library – free of charge apart from disposable parts (filters) and delivery – that can be shared by communities across London and beyond.

With a community investment of £250, we will deliver 10 diffusion tubes and support the creation of a local NO2 map. There are other levels of support for the campaign, including sponsoring a specific piece of equipment.

Use this opportunity to organise a local air quality map for your area!