Recording of a GEO6 Webinar – data and knowledge creation

This is the recording of a webinar that was dedicated to chapters 3 and 25 of the Global Environment Outlook. It covers different sources of data, including citizen science and indigenous knowledge.

Presented by

• James Donovan, CEO, ADEC Innovations
• Charles Mwangi, Deputy Country Coordinator for the GLOBE Program in Kenya
• Jillian Campbell, Statistician, UN Environment

Citizen Science 2019: Citizen Science in Action: A Tale of Four Advocates Who Would Have Lost Without You

Jessica Culpepper (Public Justice), Larry Baldwin (Crystal Coast Waterkeeper), Matt Hepler (Appalachian Voices), Michael Krochta (Bark).
Jessica opened: there can be a disconnect between the work on the ground and how it is used in advocacy, so the session focused on how to use the information to make the world a better place and hold polluters to account.
First, Michael Krochta (Bark) from Portland, OR – an NGO focusing on restoring the forests around Mt Hood, which runs volunteer surveys. Volunteers carry out ground truthing to inform management, but also litigation in cases of logging – for example, logging was suggested for a project area in an old growth forest, but volunteers identified rare species habitat, which stopped the logging. Mt Hood provides drinking water, but it is also an area of commercial logging activity. The forest service runs logging programmes: an area goes through an EIA according to NEPA and, if found suitable, it is auctioned off. The National Forest Management Act requires them to have a forest management plan, with particular concerns over the spotted owl dating from the 1970s. Each time, a large area is analysed for exploitation, and they don’t analyse it well enough. Ground truthing means training volunteers to check the information and demonstrate, for example, that an area the map indicated as only 30-year growth is actually old growth. Ground truthing includes taking images, checking the diameter of trees, and assessing the canopy cover. The forest service (USFS) has limitations: it does very simplistic analysis and applies the analysis of a small area over a large one – e.g. from an area of 11,742 acres, the NGO’s effort dropped 1,531 acres by demonstrating that aspect and slope are greater than 30%. This requires using more complex equipment.
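Such slope screening can be computed from a digital elevation model. The following is a minimal illustrative sketch with a toy DEM (my own construction, not Bark’s or the USFS’s actual workflow):

```python
import numpy as np

def slope_percent(dem, cell_size):
    """Percent slope (rise over run * 100) for each cell of a DEM grid."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)  # elevation change per metre
    return np.hypot(dz_dx, dz_dy) * 100.0

# Toy DEM: a plane rising 4 m per 10 m cell eastwards -> 40% slope everywhere
dem = np.tile(np.arange(5.0) * 4.0, (5, 1))
steep = slope_percent(dem, cell_size=10.0) > 30.0  # the 30% screening threshold
print(f"{steep.mean():.0%} of cells exceed the 30% slope threshold")
```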
The forest service describes “desired future conditions”, and volunteers can demonstrate that those conditions are already there. Another source of evidence is “survey and manage” – the forest service is required to survey and manage trees that are over 80 years old. An example is the Red Tree Vole (on which the Spotted Owl preys): because it is hard to find the voles’ nests and forest service staff do only ground-based surveys and don’t climb trees, volunteers trained to climb Douglas Fir can collect the evidence. A detailed map of the area helps in removing places that are within a given radius of identified nests. Volunteers also identified protected plant species. Existing legal hooks: the National Forest Management Act on land allocation and current ecological conditions, NEPA in terms of baseline conditions and cumulative impacts, the Endangered Species Act, and “survey and manage” from the Northwest Forest Plan. bark-out.org
Volunteers demonstrate misclassification of an old growth area
Surveying Red Tree Vole nests

Matt Hepler – Appalachian Voices, part of the Appalachian Citizens Enforcement Alliance – engages people around the Clean Water Act to monitor their watershed, bringing local knowledge to the front. People feel disempowered and don’t interact with state agencies – they have given up hope or don’t know how. The work is about holding state agencies and coal companies accountable. The sites they research are hot spots, found through word of mouth and local knowledge; they use Google Maps, Google Earth and QGIS, and they look at Discharge Monitoring Reports (DMRs) – the mines are supposed to produce a DMR for each stream, and these can be examined; locations can also be grabbed from them so the group can carry out its own analysis. They spend as much time analysing the maps to decide where to take samples as they do in the field. Mapping is important – but not every community member is good with computers, so explaining how to use GPS and coordinates matters. The maps are also important for not trespassing and for finding places where it is possible to sample properly; there can be intervening sources that affect a sampling site. Equipment is lent out through a library – with pH buffer bottles and instruments, people monitor pH, temperature, Total Dissolved Solids (TDS) and conductivity. If there is low pH or high conductivity, they do further tests for heavy metals and sulfates using lab methods (a sketch of such a screening rule is below). QA is important: training on how to calibrate, how not to trespass, and how to upload the data. Editing access to the data is limited so it can be controlled, and probes are calibrated before lending them out. They use the Virginia Tier II water quality data standards – checking that the data is good enough for state-level monitoring and evidence. There is also “Polaroid Justice” – providing photographic evidence of the work they do so it can be submitted. They have a website http://www.act-project.org and are now considering replacing it with smartphones running EpiCollect and ArcGIS Online, which allow offline data collection. ArcGIS Online can also pull useful data from the EPA, state agencies and other sources. Some successes: action on specific streams (Kelly Branch and Penn Virginia) over illegal discharge of selenium, which led to a Supplemental Environmental Project that brought money for remediation; reporting water quality violations; finding abandoned mine locations; and increased knowledge and awareness. The data has also been used by academics interested in water quality in Appalachia.
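To make the screening logic concrete, here is a minimal sketch with hypothetical thresholds (the actual Virginia Tier II criteria and the group’s exact protocol are not spelled out in my notes):

```python
# Illustrative field-screening rule: flag a sample for lab follow-up when
# the cheap field measurements look suspicious. Thresholds are invented.
PH_MIN, PH_MAX = 6.0, 9.0          # hypothetical acceptable pH range
CONDUCTIVITY_MAX = 500.0           # hypothetical threshold, in uS/cm

def needs_lab_followup(ph: float, conductivity: float) -> bool:
    """Low pH or high conductivity triggers heavy metals/sulfate lab tests."""
    return ph < PH_MIN or ph > PH_MAX or conductivity > CONDUCTIVITY_MAX

print(needs_lab_followup(ph=4.8, conductivity=820.0))  # True: send to the lab
```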

Larry Baldwin – speaking for multiple organisations that he is involved in: Crystal Coast Waterkeeper and Coastal Carolina Riverkeeper. The issues are coal ash and CAFOs – coal ash is the residue from coal used in power plants, and CAFOs are Concentrated Animal Feeding Operations for pork and poultry (turkey and chicken), a product of industrial farming. They got information from a farmer about a coal ash spill in the Dan River and took to the air, showing a spill from coal and CAFO sources. Volunteers recognised the discharge, and people took photos for weeks; there are quite a few sites like that. The issue with CAFOs from factory farms is the “lagoon” – in effect a cesspit, a hole in the ground that holds the sewage from the swine, which is then sprayed on the ground as “fertiliser”. Discharges from CAFOs are found out from neighbours who check the information and allow the organisation onto their land to sample – trespassing is otherwise an issue. There are big mountains of poultry waste, with nitrates, bacteria and all sorts of other things in them. There are 2,400 swine “lagoons”, mostly near low-income, black and Hispanic communities. So they provided tools that allow community members to collect evidence, alongside aerial monitoring with volunteer pilots who have their own aeroplanes and are willing to fly over the properties (there have been attempts to disallow flying a drone over a facility, as that is not allowed by law). After Hurricane Florence hit on a Saturday, they flew for 8 days to document the impact of the storm. Pilots were recruited through a sign on the board of the local airport (with fuel costs covered). They also run campaigns that get people involved – including billboards; the industry got so upset about the billboards that they put up their own campaign. Innovative ways of engaging people pay for themselves in terms of participation. Lawsuits are a last option – they use the Clean Air Act or legislative action to campaign and change things. Lobbying, campaigning and the court of public opinion are also important – putting the information from volunteers in front of the public through conventional media (print/radio/TV), documentaries – which have brought people from Russia, China and other countries looking to avoid the problem in their own country – and finally social media. They train people to take samples and to use equipment to prove the point on a specific issue. None of it works without volunteers who step up to be part of the solution.
Jessica Culpepper – Public Justice is a national advocacy organisation of lawyers, and she has been doing this work for 7 years. There are environmental lawsuits that are based on citizen science, and it is important to use it in these cases. There are also gag laws being put in place, even blocking access to public land (the Wyoming law); these laws exist to stop citizen scientists from identifying problems. Public Justice works to identify problems in the energy and agricultural sectors – coal ash, water. The Food Project tries to support dismantling industrial agriculture in favour of a regenerative form of animal agriculture. They believe in deep partnership with communities, representing farmers, rural communities, consumers, and workers, and they focus on communities that don’t have clean water because of nitrates. Poultry also raises issues of workers’ rights, among others. In Burton et al v Mountaire Poultry, a community of colour in Millsboro was exposed to water pollution without knowing. Raw poultry waste is sprayed on the fields, and when the incident happened, the environment agency sampled 11 wells and just sent water softeners, without explaining how they would help the situation. A group in Sussex County, Del., Keep Our Wells Safe, explained to community members that their water is not safe; stepping up is very scary – losing a job, being excluded from a local church, children being bullied, etc. There is a disposal field not far from the community. A child died from asthma, and diabetic patients lost limbs – all associated with nitrates. They started with a community well sampling project, going door to door to do onsite nitrate tests, and discovered that a lot of wells are contaminated. They used Google Earth to map nitrates and also got evidence through freedom of information requests. As a lawyer, she can demonstrate which facility is to blame – it would not have been possible to demonstrate the link without the data collection that citizen science and community science enabled. They also showed that the trend has been going up since the facility was established; the chart was created by one of the citizen scientists in the community. The data enabled the collaborative creation of a groundwater flow map with a hydrogeologist, and with that they could bring a lawsuit on behalf of the community – which raised the question of what the community wants to get out of it. They also did a media blitz in USA Today and asked why senators don’t show up in these communities, and that influenced the advocacy – it led, in the America’s Water Infrastructure Act of 2018, to a grant to monitor wells, with the polluter covering the costs if identified – this despite Tom Carper’s links to the poultry industry in Delaware. You need a positive vision, showing up and documenting, willingness by the community to be out in the media, and work with a wider network – the work of citizen scientists is amazing. Burnout is real, and you need to work with different groups – communities have been fighting for 25-30 years, and there is a personal price that they pay, including threats to family members.
Mapping with tools such as Google Earth is valuable in EJ legal cases as it shows the proximity of pollution sites to houses
Analysis by a community member provides evidence linking the development of the facility with pollution

Citizen Science 2019: Environmental Justice and Community Science: A Social Movement for Inpowerment, Compliance, and Action

The session was opened by Na’Taki Osborne-Jelks, Agnes Scott College (CSA board) – the environmental justice movement has used methods of community science that we need to include in the tent of citizen science. Sixty participants in the conference were supported by the NSF to attend, as part of a special effort to ensure that environmental justice is represented in the conference.

Ellen McCallie of the NSF, which provided a grant to support EJ activists in joining the conference, noted that NSF INCLUDES has a specific focus on those who are under-represented in STEM and underserved by NSF projects. There are about 150 NSF projects that include citizen science and crowdsourcing, and all of them push the boundaries of knowledge or help people learn about science.

The panel was moderated by Sacoby M. Wilson, Community Engagement, Environmental Justice, and Health (CEEJH), University of Maryland-College Park. The chair set three questions:

First question: how did you get into citizen science/community science?

Second question: what were some successes?

Third question: what is your message to the CSA?

Panellists:

Viola “Vi” Waghiyi, Alaska Community Action on Toxics (ACAT), St. Lawrence Island, Alaska.

Located in the northern Bering Sea; she has 4 boys and serves her community. They are close to Siberia, and the Air Force established two bases there in the Cold War. The people of the area continue to live on the land, and they want to keep their way of life and not separate themselves from the land and sea. The island is the size of Puerto Rico, but TB, starvation and other impacts reduced the population to 1,500 people. The bases, established at each end of the island, operated from the 1940s to the 1970s, and the contamination has had impacts in cancers and health defects. Their pleas for help about the impacts were ignored. With an executive who was a scientist, they started community-based participatory research, and they know that they have higher PCB levels and are one of the most contaminated communities because they rely on traditional food – chemical releases end up in their environment even though there are no chemical factories. They have a crisis in their community. She took a position to learn about chemicals and their impact on her people and has been doing it for 17 years – taking samples, doing research, and training local people.

Success – there are institutional barriers: a small non-profit has challenges in addressing the PCBs, and the state is pro-development of energy sources, so the state agencies don’t look after marginalised communities. There are also issues of funding, with refusals that treat their expertise as not valuable. Still, there are successes: so many chemicals are being created, and all of them impact you and your body, while companies don’t take human health into account. The indigenous group is party to human rights conventions and has been trained to use its voice to influence the process – work at the international level helps everyone.

Traditional knowledge – songs, dances, creation stories – needs to sit alongside sound data that scientists can use to help communities with health and disparities.

Margaret L Gordon, West Oakland Environmental Indicators Project, Oakland, CA – dealing with the dirty diesel project. She has worked with her community to improve the air near the port for over 25 years.

How did she get into the field? She got involved in citizen science because she got tired of the state and local agencies and their lack of response. They organised to demonstrate to the city, the county, and the EPA that they can collect information and measure their own air quality. They started in 2008 in Oakland and Berkeley, and researchers came to them. They started with dust measurements, with a community measurement technician who really understood how to use the equipment and keep it accurate.

Success – being part of creating equitable solutions and problem-solving mechanisms for the issues. Problem-solving means bringing in people from the city, but it needs an equitable process; she was also on the board of the Port of Oakland, and that was useful in addressing issues. Some people in citizen science who never learned how to engage with communities should not come to communities – they had to teach researchers how to work with them, and there are also issues with universities that want to collaborate but don’t share funding with community organisations. A relationship of trust and good communication can work.

We need cumulative impact studies carried out in impacted communities; there is not enough academic research in the communities that are exposed to pollution. Better impact science.

Question about climate change: we need to talk about climate justice, and that needs to include the capacity of poor communities to deal with floods and other impacts.

Omega R. Wilson, West End Revitalization Association (WERA), Mebane, NC – doing Community Owned and Managed Research, the gold standard for community science.

The EJ movement and its activity started 70 years ago (he is 69), before they were born – it was passed on from their mothers. Issues of toxic-free soil, good water, good air – there is a continuum. He moved after university to Mississippi, and in North Carolina developed a new understanding of EJ issues; with the support of the NIH, he helped to develop research in that area of North Carolina.

Successes – community groups deserve recognition in books and publications. Family members of activists have been intimidated by state officials. The use of the law is a way to get things working and to achieve results.

The Citizen Science Association should be about dealing with problems, not just studying them. Push universities to actually fund work on pollutants; the CSA should encourage growing the education of Hispanic, Black and Indigenous groups in science. The association needs to support communities in getting resources. Science for people, and science for action.

There is an issue of terminology: changing “citizen science” to “community-based science” and “community-based research”. Everyone has a right to clean water.

Vincent Martin, Community Organizer Against Petroleum Refineries, Detroit, MI – pushes issues of air quality around Detroit and is active at the national level. He got his company to assist the community with EJ issues.

The basic rights to air and water are at stake, and climate change will make things worse in poor communities. His community has coal, roads and highways, and a lot of hazardous material released into it. When they started, mapping a “white cross” for each person who died from an environmentally related disease became unbearable, and they had to stop. He experienced it himself, with a brother who died from such an impact. There was a proposal to bring tar sands to their area for processing; they pointed out that the zoning was incorrect and were ignored – when the authorities checked, they were shown to be correct, but the city authorities approved the expansion anyway. He started to learn about toxics, about the issues, and about how communities are treated in such situations. The community needs to provide oversight, be able to say “hey, we don’t want that”, and get some transparency and equity.

Beverly Wright, Deep South Center for Environmental Justice and Dillard University, New Orleans, LA – has influenced national policy and EJ issues nationally.

She got her PhD in Buffalo as a sociologist working on the trauma of Love Canal and its impact on the community. In the Mississippi Industrial Corridor they could see the chemical impact on the community, and while people could see the evidence in fish and insects, because only one chemical was considered at a time, they couldn’t show the link. In a community she worked with, residents took their own samples. She fell into citizen science through “we don’t trust you”: they recruited a toxicologist and set up environmental sociology work with the community. They created the first GIS map showing the spatial distribution of proximity to TRI facilities by race and income, and there were clusters of black communities.

Success – she is one of the only PhDs not being kicked out of community meetings. They built a community-university model in 1992, and that model was used for the EPA’s Community-University Partnership. In Louisiana there were issues with how communities were worked with – most typical environmental organisations (white, middle class – Big Green) bring students from the outside, who then go away and don’t leave anything behind. So she brought researchers to teach communities how to use the processes and collect data – that was the creation of the Bucket Brigade. The white crosses were used to demonstrate strange cancer rates in the chemical corridor. It took 18 years to win a case, but through the Bucket Brigade effort a white steam that goes through the community was captured, and the evidence was sent to the EPA. Once it was captured, the EPA changed its approach, and the community of the Diamond Plantation got organised and received funding for relocation.

On the level of pollution that is allowed by the EPA – permits are set by the first company that was allowed to pollute, and the licences are, in effect, about poisoning people.

Science does not lead to action most of the time. We need political science: science plus advocacy.

Internalised racism is real: black people are working for everybody, yet there is an issue of others speaking for them. Black people are the only group that was enslaved by this country, and that persists even in EJ; other ethnic groups – e.g. Latinos, Native Americans – do not always support the black group. It is an issue of racism that has carried over to other minority groups. But black people have learned to stand up for themselves.

Climate change: the EJ movement is pushing for the Green New Deal to include justice and equity elements, and not to allow carbon trading that will leave pollution in poor communities. We need to think about how to have a just transition to a green economy. That is an effort towards the 2020 election.

Carmen Velez Vega, PhD, MSW, Tenured Full Professor, University of Puerto Rico – Medical Sciences Campus – addresses public health issues and is involved in the recovery of Puerto Rico after Hurricane Maria.

She became involved in EJ after being an activist on LGBT issues, e.g. same-sex adoption, and that experience opened up other causes. Puerto Rico is an environmental injustice island – one phenomenon is the same people fighting on everything. As a social worker, she started to learn in the school of public health. She was involved in an NIH-funded project that was looking for someone to do community engagement alongside a known researcher; through the texts of Phil Brown, she was exposed to the risks that women of reproductive age face. There are issues of contaminated water and toxic products. She learned that not all women are exposed equally – the poorer and browner you are, the more exposed you are. After Hurricane Maria, they were abandoned by the authorities, and that added to the injustice. The injustices will not simply disappear.

The CSA should promote policies that push towards environmental justice and have impact at a larger scale, promote young people and leaders in the area of environmental justice, and work with the communities.

From environmental management to organisational strategy development: Using Drivers-Pressure-State-Impact-Response with ECSA

This week, together with Margaret Gold, I facilitated a strategy meeting of the European Citizen Science Association (ECSA). At the moment, because of a recent lecture in the Introduction to Citizen Science and Scientific Crowdsourcing course that was dedicated to environmental citizen science, the “Driving forces-Pressures-State-Impacts-Responses” (DPSIR) framework is at the front of my mind. In addition, next week I’ll participate in a workshop about Long-Term Socio-Ecological Research (LTSER), where I will discuss citizen science in another context in which DPSIR is a common framework.

However, if you are not familiar with large-scale environmental management, where it has been widely used since the mid-1990s, you’re not expected to know about it. It has its critics, but it continues to be considered an important policy tool. DPSIR starts by thinking about driving forces – trends or mega-trends that are influencing the ecosystem that you’re looking at. The drivers lead to specific pressures, for example, pollution or habitat fragmentation. To understand the pressures, we need to monitor and understand the state of the system – this is often where citizen science and sensing data are used. Next, we can understand the potential impacts and then think of policy responses. So far, hopefully clear? You can read more about DPSIR here.

I haven’t come across the use of DPSIR outside the environmental area (but maybe it exists?). However, as I was thinking about it while we prepared for the meeting, I suggested that we give it a go as a way to consider strategic actions and work for ECSA. It turns out that DPSIR is a very good tool for organisational development! It allowed us to have a 20-minute session in which we could think about external trends, and then translate them into concrete actions. Here is an example (made up, of course – I can’t disclose details from a facilitated meeting…). I’m marking things that are positive from the point of view of the organisation as (+) and negative ones as (-).

Let’s think of a citizen science coordination society (CitScCoSo). In terms of drivers, an example would be “increased recognition of citizen science”, as a Google Trends chart shows. Next, there are the pressures, which include (-) the growth in other organisations that are dedicated to citizen science and compete with CitScCoSo, which means that it will need to work harder to maintain its position, and (+) an increase in requests to participate in activities, projects, meetings, talks, etc., which creates opportunities to raise its profile and recognition. CitScCoSo’s current state might be that the organisation is funded for 5 more years and has a little spare capacity for other activities. The impacts can be (+) more opportunities for research funding and collaborations, or (-) demand for more office space for CitScCoSo and (-) lack of IT infrastructure for internal organisational processes. Finally, all this analysis can help CitScCoSo with responses – securing funding for more employees or a plan for growth.

When you do that on a flipchart with 5 columns for the DPSIR elements, it becomes a rapid and creative process for people to work through.
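If you prefer to capture the exercise digitally rather than on a flipchart, the five columns map naturally onto a small data structure. Here is a minimal sketch in Python, using the made-up CitScCoSo example from above (the class and field names are my own, not part of any DPSIR standard):

```python
from dataclasses import dataclass, field

@dataclass
class DPSIR:
    """One column per DPSIR element; items are free-text flipchart entries."""
    drivers: list = field(default_factory=list)
    pressures: list = field(default_factory=list)
    state: list = field(default_factory=list)
    impacts: list = field(default_factory=list)
    responses: list = field(default_factory=list)

session = DPSIR(
    drivers=["increased recognition of citizen science"],
    pressures=["(-) competing organisations", "(+) more participation requests"],
    state=["funded for 5 more years, little spare capacity"],
    impacts=["(+) funding and collaboration opportunities", "(-) IT gaps"],
    responses=["secure funding for more employees", "plan for growth"],
)
for column, items in vars(session).items():
    print(f"{column.upper():10} | " + "; ".join(items))
```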

As I noted, a short exercise with the ECSA board showed that this can work, and I hope that the outcomes are helpful to the organisation. I will be interested to hear if anyone else knows of alternative applications of DPSIR…

The Participatory City & Participatory Sensing – new paper

The Participatory City is a new book, edited by Yasminah Beebeejaun, which came out in March and will be launched on the 1st June. The book gathers 19 chapters that explore the concept of participation in cities of all shapes and sizes. As Yasminah notes, concern about participation started in the 1960s and has never left urban studies – be it in anthropology, geography, urban planning, history or sociology.

The book is structured around short chapters of about eight pages, with colour images that illustrate the topic of each chapter. This makes the book very accessible – and suitable for reading while commuting in a city. The chapters take you on a tour of many places in the world: from London, Berlin and Bangalore, to Johannesburg, Mexico City and small towns in Pennsylvania and Lancashire (and a few other places). It also explores multiple scales – from participation in global negotiations about urban policy in the UN, to the way immigrants negotiate a small area in central Dublin, as well as discussions of master-planning in several places, including London and Mexico City.

The book demonstrates the multi-faceted aspects of participation: from political power, to gender, environmental justice, indigenous rights, skills, expertise and the use of scientific information for decision making. Each of the chapters provides a concrete example of the participatory issue that it covers, and by doing so makes the concept that is being addressed easy to understand.

Not surprisingly, many of the success stories in the book’s chapters are minor, temporary and contingent on a set of conditions that allowed them to happen. Together, the chapters demonstrate that participation, and the demand for representation and rights to the city, are not futile efforts and that it is possible to change things.

With a price tag that is reasonable, though not cheap (€28, about £21), this is a highly recommended book that charts the aspects of urban participation in the early part of the 21st century, especially in demonstrating the challenges for meaningful participation in the face of technological developments, social and economic inequalities, and governance approaches that emphasise markets over other values.

My contribution to the book is titled ‘Making Participatory Sensing Meaningful’, and in it I examine how the concept of participatory sensing has mutated over the years to mean any form of crowdsourced sensing. I then look at our experience of participatory sensing around Heathrow to suggest the conditions that enable participatory sensing that matches the expectations of participatory processes, as well as the limitations and challenges. You can find the paper here, and the proper citation for it is:

Haklay, M., 2016, Making Participatory Sensing Meaningful, in Beebeejaun, Y. (Ed.) The Participatory City, Jovis, pp. 154-161.

Algorithmic governance in environmental information (or how technophilia shapes environmental democracy)

These are the slides from my talk at the Algorithmic Governance workshop (for which there are lengthy notes in the previous post). The workshop explored the many ethical, legal and conceptual issues with the transition to Big Data and algorithm based decision-making.

My contribution to the discussion is based on previous thoughts on environmental information and public use of it. Inherently, I see the relationships between environmental decision-making, information, and information systems as something that needs to be examined through the prism of the long history that links them; this way we can make sense of current trends. These three areas have been deeply linked throughout the history of the modern environmental movement since the 1960s (hence the Apollo 8 Earth image at the beginning), and the Christmas message from the crew, with its reference to Genesis (see below), helped in making the message stronger.

To demonstrate the way this triplet evolved, I use texts from official documents – the Stockholm 1972 declaration, Rio 1992 Agenda 21, etc. They are fairly consistent in their belief in the power of information systems to solve environmental challenges. The core aspects of environmental technophilia are summarised in slide 10.

This leads to environmental democracy principles (slide 11) and the assumptions behind them (slide 12). While information is open, that doesn’t mean it’s useful or accessible to members of the public. This was true when raw air monitoring observations were released as open data in 1997 (before anyone knew the term), and although we have better tools (e.g. Google Earth), there are consistent challenges in making information meaningful – what do you do with an Environment Agency DSM if you don’t know what it is or how to use a GIS? How do you interpret a Global Forest Watch analysis of change in tree cover in your area if you are not used to interpreting remote sensing data (a big data analysis and algorithmic governance example)? I therefore return to the hierarchy of technical knowledge and the ability to use information (in slide 20) that I covered in ‘Neogeography and the delusion of democratisation’, and look at how the opportunities and barriers have changed over the years in slide 21.

The last slides show that despite all the technical advancement, we can have situations such as the water contamination in Flint, Michigan, which demonstrates that some of the problems from the 1960s that were supposed to be solved – well monitored, with clear regulations and processes – came back because of negligence and a lack of appropriate governance. This is not going to be solved with information systems, although citizen science has a role to play in dealing with the governmental failure. This whole sorry mess, and the re-emergence of air quality as a Western-world environmental problem, is a topic for another discussion…

Algorithmic Governance Workshop (NUI Galway)

Algorithmic Governance Workshop (source: Niall O Brolchain)

The workshop ‘Algorithmic Governance’ was organised as an intensive one-day discussion and research needs development. As the organisers Dr John Danaher and Dr Rónán Kennedy identified:

‘The past decade has seen an explosion in big data analytics and the use  of algorithm-based systems to assist, supplement, or replace human decision-making. This is true in private industry and in public governance. It includes, for example, the use of algorithms in healthcare policy and treatment, in identifying potential tax cheats, and in stopping terrorist plotters. Such systems are attractive in light of the increasing complexity and interconnectedness of society; the general ubiquity and efficiency of ‘smart’ technology, sometimes known as the ‘Internet of Things’; and the cutbacks to government services post-2008.
This trend towards algorithmic governance poses a number of unique challenges to effective and legitimate public-bureaucratic decision-making. Although many are already concerned about the threat to privacy, there is more at stake in the rise of algorithmic governance than this right alone. Algorithms are step-by-step computer coded instructions for taking some input (e.g. tax return/financial data), processing it, and converting it into an output (e.g. recommendation for audit). When algorithms are used to supplement or replace public decision-making, political values and policies have to be translated into computer code. The coders and designers are given a set of instructions (a project ‘spec’) to guide them in this process, but such project specs are often vague and underspecified. Programmers exercise considerable autonomy when translating these requirements into code. The difficulty is that most programmers are unaware of the values and biases that can feed into this process and fail to consider how those values and biases can manifest themselves in practice, invisibly undermining fundamental rights. This is compounded by the fact that ethics and law are not part of the training of most programmers. Indeed, many view the technology as a value-neutral tool. They consequently ignore the ethical ‘gap’ between policy and code. This workshop will bring together an interdisciplinary group of scholars and experts to address the ethical gap between policy and code.’
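To make the ‘gap’ concrete, here is a deliberately simple sketch of my own (not an example from the workshop): a spec that says only “flag unusually high deductions for audit” forces the programmer to choose thresholds and edge-case treatments, and each choice is a value judgment that never appears in the policy.

```python
def flag_for_audit(income: float, deductions: float) -> bool:
    """Implements the vague spec 'flag unusually high deductions'."""
    if income <= 0:         # coder's choice: treat zero/negative income as suspect
        return True
    ratio = deductions / income
    return ratio > 0.35     # coder's choice: why 0.35? The spec never said.

# Two taxpayers either side of an invisible line drawn by the coder, not the law:
print(flag_for_audit(income=40_000, deductions=15_000))  # True  (ratio 0.375)
print(flag_for_audit(income=40_000, deductions=13_000))  # False (ratio 0.325)
```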

The workshop was structured around 3 sessions of short presentations of about 12 minutes each, with immediate discussion, and then a workshop to develop research ideas emerging from the sessions. This very long post contains my notes from the meeting. These are my takes, not necessarily those of the presenters. For another summary of the day, check John Danaher’s blog post.

Session 1: Perspective on Algorithmic Governance

Professor Willie Golden (NUI Galway), ‘Algorithmic Governance: Old or New Problem?’, focused on an information science perspective. We need to consider the history – an R.O. Mason paper from 1971 already questioned the balance between the decision-making that should be done by humans and the part that needs to be done by the system. The issue is the level of assumptions that are integrated into the information system. Today the amount of data being collected, and the assumptions about what it does in the world, are growing, but we need to remain sceptical about the value of the actionable information. Algorithms need managers too: Davenport, in HBR 2013, pointed out that the questions asked by decision makers before and after the processing are critical to the effective use of data analysis systems. In addition, people are very concerned about data – we’re complicit in handing over a lot of data as consumers, and the Internet of Things (IoT) will reveal much more. Deborah Estrin’s 2014 CACM viewpoint – ‘small data, where n = me’ – highlighted the importance of health information: the monitoring of personal information can provide a baseline on you. However, this information can be handed over to health insurance companies, and the question is what control you have over it. Another aspect is Artificial Intelligence – Turing in the 1950s brought the famous ‘Turing test’ for AI. In the past 3-4 years it has become much more visible. The difference is that AI learns, which raises the question of how you can monitor a thing that learns, changes over time, and gets better. AI doesn’t have self-awareness, as Davenport noted in ‘Just How Smart Are Smart Machines?’ (2015), while arguing that machines can be more accurate than humans in analysing images. We may need to be more proactive than we used to be.

Dr Kalpana Shankar (UCD), ‘Algorithmic Governance – and the Death of Governance?’, focused on digital curation/data sustainability and the implications for governance. We invest in data curation as a socio-technical practice, but we need to explore what it does and how effective current practices are. What are the implications if we don’t do the ‘data labour’ to maintain data, to avoid ‘data tumbleweeds’? We are selecting data sets and preserving them for the short and long term. There is an assumption that ‘the data is there’ and that it doesn’t need special attention. The choices that people make in preserving data sets will influence the patterns of what appears later and the directions of research. Downstream, there are all sorts of business arrangements for making data available and preserving it – the decisions shape disciplines and the discourses around them; for example, preserving census data influenced many of the social sciences and directed them towards certain types of questions. Data archives influenced the social science disciplines – e.g. using large data sets and dismissing ethnographic and qualitative data. The governance of data institutions needs to be examined, including how it influences the information that is stored and shared. What the role of curating data is when data becomes open is another question. An example of the complexity is provided by a study of a system for ‘matchmaking’ refugees to mentors, which is used by an NGO: the system is from 2006 and the last update of its job classification is from 2011, but the organisation that uses the system cannot afford to update it, and there are impacts on those who are affected by the system.

Professor John Morison (QUB), ‘Algorithmic Governmentality’. From a law perspective, there is an issue of techno-optimism. He is interested in e-participation and participation in government. There are issues of open and big data, where we are given a vision of open and accountable government and a growth in democratisation – e.g. the social media revolution, or opening government through data. We see a fantasy of abundance, and there are also new feedback loops – technological solutionism that answers problems in politics with technical fixes: simplistic solutions to complex issues. For example, in research into cybersecurity there is an expectation of creating code as a scholarly output. Big Data has different creators (from Google to national security bodies), and they don’t have the same goals. There are also issues of technological authoritarianism as a tool of control. Algorithmic governance requires engaging with epistemology, ontology and governance. We need to consider the impact on democracy – the AI approach argues for democratisation through the ‘N = all’ argument. Leaving aside the ability to ingest all the data, it seems to assume that subjects are no longer viewed as individuals but as aggregates that can be manipulated and acted upon. In algorithmic governance there is a false emancipation through the promise of inclusiveness; instead, it responds to predictions created from data analysis. The analysis claims to be a scientific way to respond to social needs, and ideas of individual agency disappear. Here we can use Foucault’s analysis of power to understand agency. Finally, we also see government without politics – making subjects and objects amenable to action. There is no selfhood, just group prediction. This transcends and obviates many aspects of citizenship.

Niall O’Brolchain (Insight Centre), ‘The Open Government’. There is a difference between government and governance. The eGov unit in the Galway Insight Centre of Data Analytics acts as an Open Data Institute node and is part of the Open Government Partnership (OGP). The OGP involves 66 countries and aims to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. It started in 2011 and now involves 1,500 people, with ministerial-level involvement. The OGP has a set of principles, with eligibility criteria that involve civic society and government on equal terms – the aim is to provide information so that it increases civic participation, requires the highest standards of professional integrity throughout administration, and increases access to new technologies for openness and accountability. It generally considers that the benefits of technology outweigh the disadvantages for citizenship. Grand challenges: improving public services, increasing public integrity, public resources, safer communities, and corporate accountability. Not surprisingly, corporate accountability is one of the weakest.

Discussion:

Using the Foucault framework, the question is about the potential for resistance that is created as power increases. There are cases to discuss around hacktivism and the use of technologies. There is an issue of the ability to resist power – e.g. the passing of details between companies based on prediction. The issue is not only about who uses the data but how they control it. Sometimes there is a need to adopt the approaches that illegal actors use to hide their tracks in order to resist it.
A challenge for the workshop is that the area is so wide that we need to focus on specific aspects – e.g. the use of systems in government while technology keeps changing, and interoperability. There are overlaps between environmental democracy and open data, with many similar actors – and with much more buy-in from government and officials. Technological change has also made it easier for governments (e.g. Mexico releasing environmental data under the OGP).
Sovereignty is also an issue – with a loss of it to technology and corporations over recent years; indeed, corporate accountability is noted in the OGP framework as an area that needs more attention.
There is also an issue about information that is not allowed to exist; absences and silences are important. There are issues of consent – network effects prevent real options for consent, and therefore society and academics can force businesses to behave socially in a specific way. Keeping information and attributing it to individuals is the crux of the matter, and where governance should come in. You have to communicate over the internet about who you are, but that doesn’t mean that we can’t dictate to corporations what they are allowed to do and how they use it. We can also consider privacy by design.

Session 2: Algorithmic Governance and the State

Dr Brendan Flynn (NUI Galway), ‘When Big Data Meets Artificial Intelligence, Will Governance by Algorithm be More or Less Likely to Go to War?’. By looking at autonomous weapons we can learn about algorithmic governance in general. Algorithmic decision support systems have a role to play in a very narrow scope – to do what stock markets do: identify very dangerous responses quickly and stop them. In terms of politics, many things will continue as before. One thing that comes from military systems is that there is always a ‘human in the loop’ – and that is sometimes the problem. There will be HCI issues with making decisions quickly based on algorithms, and things can go very wrong; there are false positive cases, such as the USS Vincennes, which used a decision support system in the decision to shoot down a passenger plane. Decision taking is limited by decision shaping, which is handed more and more to algorithms. There are issues with the way military practice understands command responsibility in the Navy, which sets a very high standard of responsibility for failure. There is a need to see how to interpret information from black boxes regarding false positives and false negatives. We can use this extreme example to learn about civic cases, and we need high standards for officials. If we apply some version of command responsibility to those who use algorithms in governance, it is possible to put responsibility on the user of the algorithm and not only on the creators of the code.

Dr Maria Murphy (Maynooth), ‘Algorithmic Surveillance: True Negatives’. We all know that algorithmic interrogation of data for crime prevention is becoming commonplace, in companies as well. We know that decisions can be about life and death. When considering surveillance, there are many issues – consider the probability of assuming someone to be a potential terrorist or extremist. In human rights law we can use the concept of private life, and algorithmic processing can challenge it. Article 8 of the Human Rights Convention is not absolute and can be qualified in specific cases – and the ECHR asks for justifications from governments, to show that they follow the guidelines. Surveillance regulations need to explicitly identify the types of people and crimes that are open to observation; you can’t say that everyone is open to surveillance. Specific keywords can be judged, but what about AI and machine learning, where the creator can’t know what will come out? There is also a need to show proportionality in preventing social harm. On false positives in algorithms: because terrorism is so rare, there is a large risk of a bad impact on the prevention of terrorism or crime. With the assumption that more data is better data, we are left with a problem of generalised surveillance, which is seen as highly problematic. Interestingly, the ECHR does see a lot of potential in technologies and their use by governments.
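A quick worked example of this base-rate problem (my numbers, not Dr Murphy’s) shows why rarity makes false positives dominate: even a highly accurate screening algorithm mislabels almost everyone it flags when the behaviour it looks for is very rare.

```python
# Assumed figures for illustration only.
base_rate = 1 / 100_000        # prevalence: 1 person in 100,000 is a real threat
sensitivity = 0.99             # P(flagged | real threat)
false_positive_rate = 0.01     # P(flagged | innocent)

# Bayes' theorem: P(real threat | flagged)
p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_threat_given_flag = (sensitivity * base_rate) / p_flagged
print(f"P(real threat | flagged) = {p_threat_given_flag:.4%}")  # about 0.1%
```

In other words, under these assumptions, over 99.9% of the people the system flags are innocent.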

Professor Dag Wiese Schartum (University of Oslo), ‘Transformation of Law into Algorithm’. His focus was on how algorithms are created, thinking about this within government systems; they are the bedrock of our welfare systems, which is the way they appear in law. Algorithms are a form of decision-making: general decisions about what should be regarded, and then the making of individual decisions. The translation of decisions into computer code takes the legal decision-making process as its raw material and transforms it into algorithms. Programmers do have autonomy when translating requirements into code – the Norwegian experience shows close work with experts to implement the code. You can think of an ideal transformation model from a system of law to algorithms, one that exists within a domain – a service or authority of a government – and is done for the purpose of addressing decision-making. The process is one of qualification of legal sources and interpretation in natural language, which then turns into a specification of rules, and then into a formal language that is used for programming and modelling. There are iterations throughout the process; the system is tested, goes through a process of confirming the specification, and then gets into use. It is too complex to test every aspect of it, but once the specifications are confirmed, it is used for decision-making. In terms of research, we need to understand the transformation process in different agencies – the overall organisation, the model of system development, competences, and the degree of law-making effects. The challenge is the need to reform the system, adapting to political and social change over time; the design needs to make the system flexible, to allow openness rather than rigidity.
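As a hedged illustration of the specification step (my own invented rule, not one of Schartum’s examples), consider how a statute saying a claimant “must have resided in the country for most of the last three years” might be formalised – every interpretive choice becomes a constant in the code:

```python
RESIDENCE_WINDOW_DAYS = 3 * 365    # interpretation: "three years" as 1,095 days
REQUIRED_FRACTION = 0.5            # interpretation: "most" means more than half

def eligible(days_resident_in_window: int) -> bool:
    """Formalised rule: eligibility based on days of residence in the window."""
    return days_resident_in_window / RESIDENCE_WINDOW_DAYS > REQUIRED_FRACTION

print(eligible(days_resident_in_window=600))  # True: 600/1095 is just over half
```

Leap years, temporary absences, and how residence is evidenced all require further interpretive choices that the statute leaves open – which is exactly where the transformation process matters.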

Heike Felzmann (NUI Galway), ‘The Imputation of Mental Health from Social Media Contributions’, brought a philosophy and psychology background. Algorithms can access different sources – blogs, social media – and these personal data are used for mood analysis, which can lead to observations about mental health. By 2013 there were examples of identifying affective disorders, and the research doesn’t consider the ethical implications. The data being used includes content, individual metadata such as the time of online activities, the length of contributions, and typing speed, as well as network characteristics and biosensing such as voice and facial expressions. Some ethical challenges include: contextual integrity (Nissenbaum 2004/2009) – privacy expectations are context-specific, not constant rules; lack of vulnerability protection – analysis of mental health breaches people’s right to protect their health information; potential negative consequences, with impacts on employment, insurance, etc.; and, finally, the irrelevance of consent – some studies included consent during development, but what about applying the systems in the world? We see no informed consent, no opt-out, no content-related vulnerability protections, no duty of care or risk mitigation, no feedback, and an unlimited number of participants. All of this is in contrast to the practices of Human Subjects Research guidelines.

Discussion:

In terms of surveillance, we should think about self-surveillance, in which citizens provide the details of surveillance themselves. Surveillance is not only negative – modern approaches are not used only for negative reasons. There is a hoarding mentality in the military-industrial complex.
The area of command responsibility received attention, with discussion of liability and the different ways in which courts treat military versus civilian responsibility.

Panel 3: Algorithmic Governance in Practice

Professor Burkhard Schafer (Edinburgh), ‘Exhibit A – Algorithms as Evidence in Legal Fact Finding’. The discussion about legal aspects can easily go back to 1066 – you can go through a whole history, and there are many links from medieval law to today. As a regulatory tool, there is the issue of the rules of proof. Legal scholars don’t focus enough on the importance of evidence and how to understand it. Regulation of technology is not about the law but about implementation on the ground, for example in the case of data protection legislation. In a recent NESTA meeting, there was a discussion about the implications of Big Data – using personal data is not the only issue. For example, a citizen science project that shows low exposure to emissions could lead to a decision that the location the citizens monitored is the perfect site for a polluting activity – so harming the people who collected the data; this is not strictly a data protection case. How can a citizen object to the ‘computer says no’ syndrome? What are the minimum criteria for challenging such a decision? What are the procedural rules of fairness? Having a meaningful cross-examination is difficult in such cases. Courts are sometimes happy to accept and use computer models, and at other times reluctant to take them. There are issues about the burden of proof from systems (e.g. showing that an ATM was working correctly when a fraud was committed). DNA tests rely on computer modelling, but on systems that are proprietary and closed. Many algorithms are hidden for business confidentiality, and these issues are being explored. One approach is to rely on open source tools; replication is another way of ensuring the results; escrow ownership of the model by a third party is another option. Finally, there is the possibility of questioning software in natural language.

Dr Aisling de Paor (DCU), ‘Algorithmic Governance and Genetic Information’ – there are issues in law and massive applications of algorithms to genetic information. There is rapid technological advancement in many settings – genetic testing, pharma and many other aspects – with indications of behavioural traits, disability, and more, and there are competing rights and interests. There are rapid advances in this area – use in health care, and the technology is becoming cheaper (already below $1,000). In commercial settings genetic information is used in insurance, and it is valuable for economy and efficiency in medical settings; there is also a focus on personalised medicine. A lot of the concerns are about the misuse of algorithms – for example, predictive assumptions about impacts on behaviour and health. The current state of predictability is limited, especially given the environmental impacts on the expression of genes. There are conflicting rights: efficiency and economic benefits challenge human rights – e.g. the right to privacy, and also the right to non-discrimination, since making decisions on the basis of probability may be deemed discriminatory. There are wider societal and public policy concerns – the possible creation of a genetic underclass and the potential to exacerbate societal stigma about disability, disease and difference. We need to identify the gaps between law, policy and code, and decide on use, commercial interests and the potential abuses.

Anthony Behan (IBM, but in a personal capacity), ‘Ad Tech, Big Data and Prediction Markets: The Value of Probability’. Advertising is a very useful use case for considering what happens in such governance processes – specifically, what happens in the 200 milliseconds that are the standard for serving an advert on the internet. The process of real-time bidding is becoming standardised. It starts from a click – the publisher invokes an API and gives information about the interactions of the user, based on their cookie and various IDs. A Supply Side Platform (SSP) opens an auction. On the demand side, there are advertisers who want to push content to people – by age group, demographic, day, time, and objectives such as click-through rates. The Demand Side Platform (DSP) looks at the SSPs, and each SSP is connected to hundreds of DSPs; complex relationships exist between these systems. The DSPs compute a probability score that the user will engage in the way they want, and offer how much it is worth to them – all in micropayments. The data management platform (DMP) is important for improving the bidding: information about users, platform and context at specific times and places helps to guess how people tend to behave. The advertising economy of the internet is based on this structure. We get abstractions of intent – the more privacy was invaded to understand personality and intent, the less interest there was in a specific person and the more in probabilities and the aggregate. People are viewed as current identity and current intent, and it’s all about mathematics – there is a huge number of transactions, and the inventory becomes more valuable. The interactions become more diverse with the Internet of Things. The internet became a ‘data farm’ – we moved from the concept that people are valuable to the view that data is valuable and something to extract from people. Advertising extends into the whole of commerce.
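A toy sketch of the auction flow described above (my own simplification, not any platform’s actual API): each DSP prices the impression as an expected value, and the SSP runs a second-price auction within the time budget.

```python
def dsp_bid(p_click: float, value_per_click: float) -> float:
    """A DSP's bid for one impression: the expected value of the click."""
    return p_click * value_per_click

def ssp_auction(dsps: list) -> tuple:
    """Second-price auction: highest bidder wins, pays the runner-up's bid."""
    bids = sorted((dsp_bid(p, v), name) for name, p, v in dsps)
    (second_price, _), (_, winner) = bids[-2], bids[-1]
    return winner, second_price

# Hypothetical DSPs as (name, P(click), value-per-click in cents):
dsps = [("dsp-a", 0.012, 50.0), ("dsp-b", 0.020, 40.0), ("dsp-c", 0.005, 90.0)]
print(ssp_auction(dsps))  # ('dsp-b', 0.6): dsp-b wins, pays the second-highest bid
```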

I’ll blog about my talk ‘Algorithmic Governance in Environmental Information (or How Technophilia Shapes Environmental Democracy)’ later.

Discussion:

There are issues with genetics and eugenics. Eugenics fell out of favour because of issues with the science, and the new genetics claims much more predictive power. In neuroscience there are similar issues with brain scans, which are based on insufficient scientific evidence and are not well handled. There is an issue with discrimination – we shouldn’t assume that it is only negative; we need to think about unjustified discrimination, and there are different semantics to the word. There are also issues with institutional information infrastructure.