Notes from the second day of the BES/sfé annual meeting (see first day notes here)

Several talks in the sessions attracted my attention:

Daniel Richards (National University of Singapore) looked at cultural ecosystem services through social media sources. He mentioned the previous study by Casalegno et al. (2013) on social media and ecosystem services. In Singapore they carried out a study of the few green spaces that are used for leisure and of the nature reserves – the rest of the island is famously highly urbanised. There are patches of coastal habitat that are important locally. The analysis looked at Flickr photos to reveal interest. There were 4 study sites, with 760 photos returned, of which 683 related to coastal habitat. The content was classified, with 8 people analysing the photos. The analysis of Flickr showed different aspects at different sites – landscape in one site, wildlife in another. In one site there were research photos, due to the way it is used locally. Looking closely at one coastal site, focal points along the route where people stopped to take a picture stood out, as did landscape photos. All the photos follow the boardwalk in the Changi area, which is the only route. A simulation showed that after about 70 photos they can get a good indication of the nature of the place, so there is no need to look through all the images.
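The saturation point is the kind of thing that can be checked with a simple resampling exercise. The sketch below is my own illustration of the idea, not the study's actual method – the split of the 683 coastal habitat photos into content categories is invented:

```python
import random
from collections import Counter

# Invented split of the 683 coastal habitat photos into content categories,
# standing in for the labels assigned by the eight analysts.
photo_labels = ['landscape'] * 300 + ['wildlife'] * 250 + ['research'] * 133

def category_shares(labels):
    """Proportion of each content category in a set of labels."""
    counts = Counter(labels)
    return {cat: n / len(labels) for cat, n in counts.items()}

def mean_shares_for_sample_size(labels, sample_size, trials=500):
    """Average category shares over repeated random samples of a given size."""
    accum = Counter()
    for _ in range(trials):
        for cat, share in category_shares(random.sample(labels, sample_size)).items():
            accum[cat] += share / trials
    return dict(accum)

full = category_shares(photo_labels)
for size in (10, 30, 70, 150):
    sampled = mean_shares_for_sample_size(photo_labels, size)
    max_gap = max(abs(sampled.get(cat, 0) - full[cat]) for cat in full)
    print(f"sample of {size:>3} photos: largest deviation from the full set = {max_gap:.3f}")
```

If the largest deviation stops shrinking noticeably beyond a given sample size, looking at more photos adds little – which is, roughly, the point the simulation in the talk made about 70 photos.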

Barbara Smith explored the role of indigenous and local knowledge as part of a multiple evidence base for pollinator conservation. The context is an agricultural area in India – looking at places with more extensive agriculture and places with less. The project aims to record pollinators and then explore the impact of landscape and crop productivity. In this study, the starting point was the belief that traditional knowledge has a lot of value, and that it is knowledge that can be integrated with scientific information. She mentioned the Tengö et al. (2013) discussion paper in IPBES on the value of local knowledge, and also the Sutherland et al. (2014) paper in Oryx about the need to integrate indigenous knowledge in ecological assessment. The aim was to collate knowledge of trends, so they created a local peer-review process to validate local knowledge – understanding factual data collection and separating it from inferences, which are sometimes wrong. They carried out small group discussions involving 5-7 farmers; in each of the 3 study areas they had 3 groups. They asked questions that were about evidence gathering (which crops do you grow?) and verification (how do you know?); they also asked opinion-scoping questions (perceptions) and then 'why did you observe the change?'. The discussions with the farmers were structured around questions that could be explored together. After the first session, they created declarations – such as 'yields have fallen by 25%' or 'crop yield declined because of the poor soil' – and the statements were accepted or rejected through discussion with the farmers: a local peer-review. Not all farmers can identify pollinators, and as the size of the insect goes down, there is less identification and also confusion between pests and pollinators. The farmers identified critical pollinators in their area and also offered suggestions on why the decline happened.

In the workshop on 'Ecosystem assessments – concepts, tools and governance' there were various discussions of the tools that are used for such purposes, but it became clear to me that GIS is playing a major role, and that many of the fundamental discussions in GIScience around the different types of modelling – from overlays to process-oriented modelling – can play a critical role in making sense of the way maps and GIS outputs travel through decision making. It could be an interesting area to analyse critically: to what degree are the theoretical and philosophical aspects of the modelling taken into account in policy processes? The discussion in the workshop moved on to issues of scientific uncertainty and communication with policy makers – the role of researchers in the process and the way they discuss uncertainty.

In the computational ecology session, Yoseph Araya presented a talk that was supposed to be about the use of citizen science data, but instead he shared his experience and provided an interesting introduction to a researcher's perspective on citizen science. He looked at the data that comes from citizen science and the problem of getting good data. Citizen science is gaining attention – e.g. ash dieback and other environmental issues are drawing attention to it. Citizens are bridging science, governance and participation. Citizen science is needed for data at temporal, spatial and social scales, and we should not forget that it is also about social capital, and of course fun and enjoyment. There is growing awareness of citizen science in the literature. He is building on experience from the many projects in which he has participated, including Evolution MegaLab, World Water Monitoring Day, the Floodplain Meadows Partnership, iSpot and OPAL, and CREW – Custodians of Rare and Endangered Wildflowers (that's a seriously impressive set of projects!). There are plenty of challenges – recruitment and motivation; costs and who pays; consideration of who runs it; data validation and analysis; and others. Data issues include accuracy, completeness, reliability, precision and currency. He identified sources of error – personnel, technical and statistical. The personnel issues include skills, fitness, mistakes and others. Potential solutions include training alongside fully employed personnel, monitoring individuals and running an online quiz. Technically, there is the option of designing suitable protocols, and statistically it is possible to use recounts (of around 15% of records), protocols that allow 'no data', and other methods.
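For the recount idea, a simple way to think about it is to re-survey a subset of the records and report how often the recount agrees with the original report. A minimal sketch of that check, with invented records rather than anything from the talk:

```python
import random

rng = random.Random(42)

# Hypothetical volunteer counts keyed by record id.
original = {i: rng.randint(1, 10) for i in range(1, 41)}

def select_for_recount(record_ids, fraction=0.15):
    """Pick the subset of records to re-survey (e.g. the ~15% mentioned in the talk)."""
    k = max(1, int(len(record_ids) * fraction))
    return rng.sample(list(record_ids), k)

def agreement_rate(original, recount):
    """Share of re-surveyed records where the recount matches the original report."""
    matches = sum(1 for rid, value in recount.items() if original[rid] == value)
    return matches / len(recount)

to_recheck = select_for_recount(original)
# Stand-in for a fresh field visit: copy the original values and perturb one
# of them so the output shows what a disagreement looks like.
recount = {rid: original[rid] for rid in to_recheck}
recount[to_recheck[0]] += 1

print(f"{len(to_recheck)} records re-surveyed, "
      f"agreement rate {agreement_rate(original, recount):.0%}")
```

The agreement rate from the recounted subset then gives a rough estimate of error levels across the whole dataset.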

The poster session included a poster from Valentine Seymour about her work linking wellbeing and green volunteering.

As part of the citizens' observatories conference, I represented Mapping for Change, providing an overview of the community-led air quality studies that we have run over the past 4 years. Interestingly, since we started the work in collaboration with London Sustainability Exchange, and with help from the Open Air Laboratories programme, the work can be placed within the wider context of NGOs' work on citizen science, which was a topic covered in the conference.

The talk covered the different techniques that were used: eco-badges for ozone testing, wipe sampling, diffusion tubes, and particulate matter monitoring devices. In the first study, we were also assisted by Barbara Maher's team, who explored the use of tree leaves for biomonitoring. The diffusion tubes are of particular importance, as the change in deployment and visualisation created a new way for communities to understand air quality issues in their area.

The use of a dense network of diffusion tubes has become common in other communities over the past 4 years. I also covered the engagement of local authorities, with a year-long study in the Barbican supported by the City of London. There is a lesson here about the diffusion of methodologies and approaches among community groups – for example, the No to Silvertown Tunnel group carried out a diffusion tube study without any linkage to Mapping for Change or London Sustainability Exchange. Overall, this diffusion means that over 20 localised studies are emerging across London.
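To give a sense of what such a community dataset looks like once the tubes come back from the laboratory, here is a minimal sketch with invented sites and readings, comparing site means against the EU annual mean limit value for NO2 of 40 µg/m³ (the sites, values and simple averaging are my illustration, not the analysis used in the studies):

```python
# Hypothetical monthly NO2 diffusion tube readings (µg/m³) per site;
# real studies usually apply a laboratory bias-adjustment factor first.
readings = {
    "School gate":   [52, 48, 55, 61, 49, 47],
    "Back garden":   [28, 31, 26, 30, 29, 27],
    "Main junction": [68, 72, 65, 70, 74, 69],
}

EU_ANNUAL_LIMIT = 40  # EU annual mean limit value for NO2, in µg/m³

for site, values in readings.items():
    mean = sum(values) / len(values)
    flag = "above" if mean > EU_ANNUAL_LIMIT else "below"
    print(f"{site:<14} mean {mean:5.1f} µg/m³ ({flag} the {EU_ANNUAL_LIMIT} µg/m³ annual limit)")
```

Even a table this simple, mapped across a dense network of sites, is what allows a community to see which street corners are the problem – which is exactly the shift in understanding that the visualisation created.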

On 29th April, I gave a talk at the Wilson Center in Washington DC on 'Environmental Information – the Roles of Experts and the Public'. The event was organised by Lea Shanley, who is heading the 'Commons Lab' initiative of the center, and Dr Jay Benforado, from the US Environmental Protection Agency (EPA), provided a response to the talk.

The talk is based on a forthcoming chapter in a book that will be the final output of the EveryAware project, and I can share a copy of it if you email me.

I described the content of the talk as:  ‘Access to environmental information and use of it for environmental decision-making are central pillars of environmental democracy. Yet, not much attention is paid to the question of who is producing it, and for whom? By examining the history of environmental information, since NEPA in 1969, three eras can be identified: information produced by experts, for experts (1969-1992); information produced by experts, to be shared by experts and the public (1992-2012); and finally, information produced by experts and the public to be shared by experts and the public.

Underlying these are changes in access to information, rise in levels of education and rapid change due to digital technologies. The three eras and their implication to environmental decision-making will be explored, with special attention to the role of geographical information systems and to citizen science.’

The talk (and the chapter) build on the themes that I discussed in a presentation during the Eye on Earth user conference in Dublin in 2013, and in earlier talks at the Oxford Transport Studies Unit, the UCL Centre for Advanced Spatial Analysis, and the University College Dublin School of Geography, Planning & Environmental Policy in 2010 (see also my reflection from the Eye on Earth summit in Abu Dhabi in 2011). In the talk I covered some of the legal frameworks around the production and use of environmental information, including laws and international agreements, as well as specific demonstrations of the information systems themselves, to illustrate the practice. I also tried to suggest the trends that lie behind the changes between the eras, among which rising levels of education are quite central.

On reflection, the 4 years that have passed since I started thinking about the 'eras of environmental information' have allowed me to work out how to communicate them – for the better, I hope. They also made the writing up of the chapter easier, as the responses and comments that I received in previous talks provided the feedback and peer review needed to structure the text.

Although I set specific dates as markers for the eras, the reality is that the boundaries are more flexible and the transitions happened over time – this is especially true of the latest transition, towards public participation in environmental information production.

The talk was followed by a discussion that lasted almost 45 minutes, covering the common issue of the quality of citizen science data, as well as interesting points about dissemination, as Rob Baker asked: 'Is the role of experts as facilitators extend to dissemination of information or just collection? Who closes the loop?' (https://twitter.com/rrbaker/statuses/461157624592203776), and Susan Wolfinbarger's question about citizen science: 'How do you know when the quality of a #citsci project is bad?' (https://twitter.com/SWolfinbarger/statuses/461166886043262976).

The presentation and discussion were captured on YouTube, and the slides are available on SlideShare.

Kate Chapman posted an interesting reflection on the talk over at the H.O.T. website.

The Guardian's Political Science blog post by Alice Bell about the Memorandum of Understanding between the UK Natural Environment Research Council and Shell reminded me of a nagging issue that has concerned me for a while: to what degree has GIS contributed to anthropogenic climate change? And, more importantly, what should GIS professionals do about it?

I'll say from the start that the reason it concerns me is that I don't have easy answers to these questions, especially not to the second one. While I personally would like to live in a society that moves very rapidly to renewable energy resources, I also take flights, drive to the supermarket and benefit from the use of fossil fuels – so I'm in the 'Hypocrites in the Air' position, as Kevin Anderson defined it. At the same time, I feel that I do have a responsibility, as someone who teaches future generations of GIS professionals how they should use the tools and methods of GIScience responsibly. The easy way out would be to tell myself that since, for the past 20 years, I've been working on 'environmental applications' of GIS, I'm on the 'good' side as far as sustainability is concerned. After all, the origins of the biggest player in our industry are environmental (environmental systems research, even!), we talk regularly about 'Design With Nature' as a core text that led to the overlay concept in GIS, and we praise the foresight of the designers of the UNEP Global Resource Information Database in the early 1980s. Even better, Google Earth brings climate change information and education to anyone who wants to download it from the Met Office.

But technologies are not value-free; they encapsulate certain values. That's what critical cartography and critical GIS have highlighted since the late 1990s. Nadine Schuurman's review is still a great starting point for this literature, but most of it analysed the link between the history of cartography and GIS and military applications, or, in the case of the volume 'Ground Truth', the use of GIS in marketing and the classification of people. To the best of my knowledge, critical GIScience has not turned its sights to oil exploration and extraction. Of course, issues such as pollution, environmental justice or the environmental impacts of oil pipelines are explored, but do we need to take a closer look at the way that GIS technology was shaped by the needs of the oil industry? For example, we use, without a second thought, the EPSG (European Petroleum Survey Group) definitions of coordinate reference systems in many tools. There are histories of widely used products, such as Oracle Spatial, where some features were developed specifically for the oil & gas industry. There are secretive and proprietary projections and datums, and GIS products that are unique to this industry. One of the most common spatial analysis methods, Kriging, was developed for the extractive industry. I'm sure that there is much more to explore.
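To illustrate how deeply the EPSG registry is woven into everyday GIS work, here is a small example using the pyproj library – my illustration of the point, not something from the original post – in which both the GPS datum and the British National Grid are referenced by their EPSG codes:

```python
from pyproj import CRS, Transformer

# Most coordinate reference systems in GIS software are referenced by an
# EPSG code - a registry that originated with the European Petroleum Survey Group.
wgs84 = CRS.from_epsg(4326)    # WGS 84, i.e. GPS latitude/longitude
bng = CRS.from_epsg(27700)     # OSGB 1936 / British National Grid

transformer = Transformer.from_crs(wgs84, bng, always_xy=True)
easting, northing = transformer.transform(-0.1340, 51.5246)  # approx. UCL, London
print(f"easting {easting:.0f}, northing {northing:.0f}")
```

Every time such a transformation runs, a small piece of infrastructure built for the petroleum industry is quietly doing the work.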

So, what is the problem with that, you might ask?

(Image: Well Architect)

Fossil fuels – oil, coal, gas – are at the centre of the process that leads to climate change. Another important thing about them is that once they've been extracted, they are likely to be used. That's why there are calls to leave them in the ground. When you look at the way exploration and production work, such as in the image here from 'Well Architect', you realise that geographical technologies are critical to the ability to find and extract oil and gas. They must have played a role in the industry's ability to identify, drill and extract in places that were not feasible a few decades ago. I remember my own amazement the first time I saw the complexity of the information that is being used and the routes that wells take underground, such as what is shown in the image (I'll add that this was during an MSc project sponsored by Shell). In another project (sponsored by BP), it was just as fascinating to see how palaeogeography is used for oil exploration. Therefore, within the complex process of finding and extracting fossil fuels, which involves many engineering aspects, geographical technologies do have an important role – but how important? Should critical GIScientists or the emerging Critical Physical Geographers explore it?

This brings up the thornier issue of the role of GIS professionals today, and more so of people who are entering the field, such as students studying for an MSc in GIS and similar programmes. If we accept that most fossil fuels should stay underground and not be extracted, then what should we say to students? If a person who is involved in working to help increase oil production does not accept the science of climate change, or doesn't accept that there is an imperative to leave fossil fuels in the ground, I may accept and respect their personal view. After all, as Mike Hulme noted, the political discussion is now more important than the science, and we can disagree about it. On the other hand, we can take the point of view that we should deal with climate change urgently and go down the path of reducing extraction rapidly. In terms of action, we see students joining campaigns for fossil-free universities, with which I do have sympathy. However, here we hit another difficult point: we need to consider the personal cost of higher education and the opportunity for well-paid jobs, which include tackling interesting and challenging problems. With the closure of many other jobs in GIS, what is the right thing to do?

I don't have an easy answer, nor can I say categorically that I will never work with the extractive sector. But when I was recently asked by a student to provide a reference letter for the oil and gas industry, I felt obliged to state that 'I can completely understand why you have chosen this career, I just hope that you won't regret it when you talk with your grandchildren one day in the future'.

The first Eye on Earth user conference, held in Dublin at the beginning of March, was as interesting as the first summit in Abu Dhabi in December 2011. Significantly, the role of citizen science in environmental monitoring and in the creation of useful environmental information was highlighted throughout the conference, from the opening address by Prof Jacquie McGlade, the head of the European Environment Agency, to the final statement of the meeting, which declared that the Eye on Earth Network sees "citizen science as an important source of knowledge within the diversity of knowledge communities".

I've been following the Eye on Earth network with a lot of interest: with its combination of environmental information for public access, use of GIS and integration of citizen science, it touches on many of my research interests of the past 15 years. I was not surprised to find the conference and the discussions during it very stimulating.

As the conference progressed, and more and more examples were given of how effortlessly information can be accessed through 'the cloud', I became aware that there was a hidden partner to the whole process whose role is generally ignored: computing doesn't happen in the ether, and it does have environmental consequences – as the New York Times investigation explored. It was valuable to hear about Microsoft's environmental activities at the end of the conference, but that was done in a not completely connected way. So the issue with environmental information is that there is a need to turn the systems that are used to collect, manage and share environmental information into exemplars of 'deep green computing'. A lot of the data is paid for by public sector bodies, and contracts can include demands for improving environmental performance as an integral part of dealing with this information. Otherwise, the information itself can become part of the problem instead of part of the solution!

It is possible, even at a small scale. At Mapping for Change, we needed to change hosting provider, and it was clear to us that we needed to do things right, so we set out to look for a provider that is reliable but also respects the values of the business itself (both social and environmental). This reduced the number of possible providers, but we are now switching over to ecohosting, who demonstrate that it is possible to provide web hosting with suitable environmental standards.

Recently, I attended a meeting with people from a community that is concerned about vibration and noise caused by a railway near their homes. We discussed the potential of using citizen science to measure the vibrations that pass the sensory threshold and that people classify as unpleasant, together with other perceptions and feelings about these incidents. This can form the evidence for a discussion with the responsible authorities to see what can be done.
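As a rough illustration of the kind of record such a study could produce – the threshold, units and ratings here are assumptions for the sketch, not the community's actual protocol – the idea is simply to flag readings that pass a perception threshold and attach residents' own classification to each event:

```python
from dataclasses import dataclass

PERCEPTION_THRESHOLD = 0.3  # hypothetical threshold, mm/s peak particle velocity

@dataclass
class VibrationEvent:
    timestamp: str
    peak_velocity: float   # measured vibration level in mm/s
    resident_rating: str   # how residents classified the incident

# Invented readings paired with residents' ratings.
readings = [
    ("2014-05-01 07:42", 0.45, "unpleasant"),
    ("2014-05-01 09:10", 0.12, "not felt"),
    ("2014-05-01 18:05", 0.38, "noticeable"),
]

events = [VibrationEvent(t, v, r) for t, v, r in readings if v >= PERCEPTION_THRESHOLD]
for e in events:
    print(f"{e.timestamp}: {e.peak_velocity} mm/s, rated '{e.resident_rating}'")
```

A log of events like this, combining measured levels with how people experienced them, is the sort of evidence that can be brought to the responsible authorities.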

As a citizen science activity, this is not dissimilar to the work carried out around Heathrow to measure the level of noise nuisance, or the air pollution monitoring that ExCiteS and Mapping for Change have carried out with other communities.

In the meetings, the participants felt that they needed to emphasise that they are not against the use of the railway or the development of new railway links. Like other groups that I have met in the past, they felt it important to emphasise that their concern is not only about their locality – in other words, this is not a case of 'Not In My Back Yard' (NIMBY), which is the most common dismissal of local concerns. The concern over NIMBY and citizen science is an obvious one, and frequently comes up in questions about the value and validity of data collected through this type of citizen science.

During my masters studies, I was introduced to Maarten Wolsink's (1994) analysis of NIMBY as compulsory reading in one of the courses. It is one of the papers that I keep referring to from time to time, especially when complaints about participatory work and NIMBY come up.
Essentially, what Wolsink demonstrates is that the conceptualisation of the people who are involved in the process as selfish and focused only on their own area is wrong. Through engagement with environmental and community concerns, people explore issues at wider scales and will many times argue for 'Not in Anyone's Back Yard', or for a balance between the needs of infrastructure development and their own quality of life. Studies of environmental justice have also demonstrated that what the people involved in such activities ask for is not narrow, but often mixes aspects of the need for recognition, expectations of respect, arguments of justice, and participation in decision-making (Schlosberg 2007).

In other words, citizen science and systematic data collection are a way for the community to bring to the table evidence that can support arguments beyond NIMBY – NIMBY might be part of the story, but it is not the whole story.

For me, these interpretations are part of the reason that such ‘activism’-based citizen science should receive the same attention and respect as any other data collection, most notably by the authorities.

Wolsink, M. (1994) Entanglement of Interests and Motives: Assumptions Behind the NIMBY-Theory on Facility Siting, Urban Studies, 31(6), pp. 851-866.
Schlosberg, D. (2007) Defining Environmental Justice: Theories, Movements, and Nature. Oxford: Oxford University Press.
