UCL Institute for Global Prosperity Talk: Extreme Citizen Science – Current Developments

The slides below are from a talk that I gave today at UCL Institute for Global Prosperity

The abstract for the talk is:

With a growing emphasis on civil society-led change in diverse disciplines, from International Development to Town Planning, there is an increasing demand to understand how institutions might work with the public effectively and fairly.

Extreme Citizen Science is a situated, bottom-up practice that takes into account local needs, practices and culture and works with broad networks of people to design and build new devices and knowledge creation processes that can transform the world.

In this talk, I discussed the work of the UCL Extreme Citizen Science group within the wider context of developments in the field of citizen science. I covered the work that ExCiteS has already done, what it is currently developing, and its plans for the future.

Citizen Science Data & Service Infrastructure

Following the ECSA meeting, the Data & tools working group workshop was dedicated to progressing the agenda on data & infrastructure.

Jaume Piera (chair of the Data and Tools working group of ECSA) covered the area of citizen science data – moving from ideas, to particular solutions, to global proposals. Starting from separate platforms (iNaturalist, iSpot, GBIF, eBird), the creation of different citizen science associations and the evolution of ideas for interoperability allow us to consider the ‘Internet of People’, which is about participatory sharing of data. We can work in a similar way to standards development for the internet, starting to consider the layers: interoperability, privacy/security, data reliability, infrastructure sustainability, data management, intellectual property rights, engagement, human-computer interaction, reference models and testing. By considering these multiple layers, we can develop a roadmap for development and consider a range of solutions at the different ‘layers’. The idea is to open it to other communities – and aim to have solutions that are discussed globally.

Arne Berre explained the CITI-SENSE platform. There is a paper on the project site that explains the CITI-SENSE architecture. He proposed that we use the European Interoperability Framework – legal, organisational, semantic and technical. In the technical area, we can use ISO 19119 and OGC standards, with six areas: boundary, processing/analytics, data/model management, communication and systems, and we can use reference models. He also suggested considering the INSPIRE life-cycle model. There is a challenge in adapting standards to the context of citizen science, so in many ways we need to treat this as a conceptual framework for considering the different issues. In CITI-SENSE they developed a life cycle that covered human sensor data services as well as the hardware sensor application platform.


Ingo Simonis (OGC) discussed a standardised encoding to exchange citizen science data. He described the work that OGC is doing on the sensor web for citizen science, for which they collected data from different projects. Citizen science information comes from different surveys, in different forms and structures. The requirement is to capture citizen + environment + sensor: who made a particular measurement? What was the environment – e.g. was it rainy while they collected the data? And what sensor was used? The OGC O&M citizen observatories model is therefore conceptual. It is an observation model – assigning a value to a property – and they also look at standards for sensors, such as OGC SensorML, building on the ISO 19100 series of standards. The observation model tries to address observations that happen offline and are shared later. The model also deals with stationary and mobile sensing activities, and allows for flexibility – for example, having an ad-hoc record that does not follow a specific process.
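To make the shape of such an observation record concrete, here is a very loose sketch of it as a data structure. The class and field names are illustrative assumptions on my part, not the actual OGC O&M or SensorML schema – the point is only to show how one record can tie together the citizen, the environmental context, the sensor, and the offline/online timing:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Sensor:
    """Hypothetical sensor description; the model covers stationary and mobile sensing."""
    sensor_id: str
    description: str   # e.g. "smartphone thermometer", "hand-held pH meter"
    mobile: bool

@dataclass
class Observation:
    """An observation assigns a value (result) to a property, in O&M style."""
    observer: str                     # who made the measurement (the citizen)
    observed_property: str            # the property a value is assigned to
    result: float                     # the value itself
    phenomenon_time: datetime         # when it was observed
    upload_time: Optional[datetime]   # offline observations are shared later
    environment: dict = field(default_factory=dict)  # e.g. {"weather": "rainy"}
    sensor: Optional[Sensor] = None   # ad-hoc records may lack a formal sensor

# Example record: an offline measurement uploaded later in the day.
obs = Observation(
    observer="volunteer-42",
    observed_property="air_temperature",
    result=18.5,
    phenomenon_time=datetime(2015, 10, 20, 9, 30),
    upload_time=datetime(2015, 10, 20, 18, 0),
    environment={"weather": "rainy"},
    sensor=Sensor("s-1", "smartphone thermometer", mobile=True),
)
```

An ad-hoc record that follows no specific process would simply omit the `sensor` and leave `environment` empty – the flexibility the model aims for.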


Alex Steblin – the Citclops project includes applications such as Eye on Water (eyeonwater.org). Citclops faces the challenge of maintaining the project’s data once the project has finished.

Veljo Runnel covered EU BON’s work (www.eubon.eu) – mobilising biodiversity data is challenging. They want a registry of online tools for citizen science projects – a tool that will allow people who work with citizen science to record information about a project as it relates to biodiversity, such as links to GBIF, recording of DNA, or use of a mobile app. Finding the person who runs a given tool is difficult. EU BON has a ‘data mobilization helpdesk’; the elements of the standard were discussed within the EU BON consortium, and they are going to explore how to provide further input.

JRC is exploring the possibility of providing infrastructure for citizen science data – both metadata and the data itself.

Translating technical information into accessible language is valuable for the people who will be using it; we need to find ways to make information more accessible and digestible. The aim is to start developing reference material and building on existing experiences – sub-dividing the working group into specific areas. There are many sub-communities that are not represented within the data group (and in ECSA), and we need to reach out to different communities and include more groups. There are also issues about linking to US activities, and linking activities from the small scale (neighbourhoods) to large organisations. As we work through the information, we need to be careful about technical language and share information in an accessible way.

Environmental information: between scarcity/abundance and emotions/rationality

The Eye on Earth Summit, which was held in Abu Dhabi last week, allowed me to immerse myself in the topics that I’ve been researching for a long time: geographic information, public access to environmental information, participation, citizen science, and the role of all these in policy making. My notes (day 1 morning, day 1 afternoon, day 2 morning, day 2 afternoon, day 3 morning & day 3 afternoon) provide the background for this post, as do the blog posts from Elisabeth Tyson (day 1, day 2) and the IISD reports and bulletins from the summit. The summit provided me with plenty to think about, so I thought it worth reflecting on my ‘take home’ messages.

What follows are my personal reflections from the summit and the themes that I feel are emerging in the area of environmental information today. 

When considering the recent ratification of the Sustainable Development Goals (SDGs) by the UN General Assembly, it is not surprising that they loomed large over the summit – as drivers of demand for environmental information over the next 15 years, as focal points for coordinating information collection and dissemination, but also as an opportunity to make new links between environment and health, or to promote environmental democracy (access to information, participation in decision-making, and access to justice). The SDGs seem to be at the front of the minds of the international organisations that are part of the Eye on Earth Alliance, although other organisations, companies and researchers who come with a more technical focus (e.g. Big Data or remote sensing) are less aware of them – at least in terms of referring to them in their presentations during the summit.

Beyond the SDGs, two overarching tensions emerged throughout the presentations and discussions – and both are challenging. They are the tensions between abundance and scarcity, and between emotions and rationality. Let’s look at them in turn.

Abundance and scarcity came up again and again. On the data side, the themes of the ‘data revolution’, more satellite information, crowdsourcing from many thousands of weather observers, and the creation of more sources of information (e.g. the Environmental Democracy Index) are all examples of abundance in the amount of available data and information. At the same time, this was contrasted with scarcity in the real world (e.g. species extinction, the health of mangroves), scarcity of actionable knowledge, and scarcity of ecologists with computing skills. Some speakers oscillated between these two ends within a few slides, or even within the same one. There was no easy resolution for this tension, and both ends were presented as challenges.


With emotions and scientific rationality, the story was different. Here the conference was packed with examples that we’re (finally!) moving away from a simplistic ‘information deficit model‘ that emphasises scientific rationality as the main way to lead a change in policy or in public understanding of environmental change. Throughout the summit, presenters emphasised the role of mass-media communication, art (including a live painting developed through the summit by the GRID-Arendal team), music, visualisation and storytelling as vital ingredients that make information and knowledge relevant and actionable. Instead of a ‘Two Cultures’ position, Eye on Earth offered a much more harmonious and collaborative linkage between these two ways of thinking and feeling.

Next, and linked to the issue of abundance and scarcity, are costs and funding. Many talks demonstrated the value of open data and the need to provide open, free and accessible information if we want to see environmental information used effectively. Moreover, providing the information together with the ability to analyse or visualise it over the web was offered as a way to make it more powerful. However, these systems are costly, and although the IUCN’s assessment demonstrated that the investment in environmental datasets is modest compared to other sources (and the same is true for citizen science), there are as yet no sustainable, consistent and appropriate funding mechanisms. Funding infrastructure or networking activities is also challenging, as funders accept their value but are not willing to fund them in a sustainable way. More generally, there is an issue about the need to fund ecological and environmental studies – it seems that while ‘established science’ is busy with ‘Big Science’ (satellites, Big Data, complex computer modelling), the work of studying ecosystems in a holistic way is left to small groups of dedicated researchers and to volunteers. The urgency and speed of environmental change demand better funding for these areas and activities.

This leads us to the issue of citizen science, for which the good news is that it was mentioned throughout the summit, gaining more prominence than four years ago at the first summit (where it also received attention). In all plenary sessions, citizen science or crowdsourced geographic information was mentioned at least once, and frequently by several speakers. Examples include the Hermes project for recording ocean temperatures, Airscapes Singapore for urban air quality monitoring, Weather Underground for sharing weather information, Humanitarian OpenStreetMap Team’s work in Malawi, Kathmandu Living Labs’ response to the earthquake in Nepal, the Arab Youth Climate Movement’s use of iNaturalist in Bahrain to record ecological observations, Jacky Judas’s work with volunteers to monitor dragonflies in Wadi Wurayah National Park – and many more. The summit outcomes document is also clear: “The Summit highlighted the role of citizen science groups in supporting governments to fill data gaps, particularly across the environmental and social dimensions of sustainable development. Citizen Science was a major focus area within the Summit agenda and there was general consensus that reporting against SDGs must include citizen science data. To this end, a global coalition of citizen science groups will be established by the relevant actors and the Eye on Earth Alliance will continue to engage citizen science groups so that new data can be generated in areas where gaps are evident. The importance of citizen engagement in decision-making processes was also highlighted.”

However, there was ambivalence about it – should citizen science be seen as an instrument, a tool to produce environmental information, or as a means to achieve wider awareness and engagement by informed citizens? How best to achieve the multiple goals of citizen science: raising awareness, educating, providing skills well beyond the specific topic of the project, and democratising decision-making and participation? It seems to still be the case that integrating citizen science into day-to-day operations is challenging for many of the international organisations involved in the Eye on Earth Alliance.

Another area of challenging interactions emerged from the need for wide partnerships between governments, international organisations, non-governmental organisations (NGOs), companies, start-ups, and even the ad-hoc crowds – afforded by digital and social networks – that respond to a specific event or issue. There are very different speeds of implementation and delivery between these bodies, and in some cases there are chasms that need to be explored – for example, an undercurrent from some technology start-ups is that governments are irrelevant, and in some forms of thinking it is OK ‘to move fast and break things’ – including existing social contracts and practices. It was somewhat surprising to hear speakers praising Uber or Airbnb, especially when they came from people who are familiar with the need for careful negotiations that take wider goals and objectives into account. I can see the wish to move things faster – but what risks do we bring by breaking things?

With the discussions about Rio Principle 10, the new developments in Latin America, the Environmental Democracy Index and the rest, I became more convinced, as I noted in 2011, that we need to start thinking about adding another right to the three that are included in it (access to environmental information, participation in decision-making, and access to justice): a right to produce environmental information that will be taken seriously by the authorities – in other words, a right to citizen science. I was somewhat surprised by the responses when I raised this point during the discussion on Principle 10.

Final panel (source: IISD)

Finally, Eye on Earth was inclusive and collaborative, and it was a pleasure to see how open people were to discussing issues and exploring new connections, points of view and ways of thinking. A special point that drew several positive responses was the gender representation at such a high-level international conference with a fairly technical focus (see the image of the closing panel). The composition of the speakers, and the fact that such a level of women’s representation was possible, was fantastic to experience (and made one of the male-only panels on the last day look odd!). It is also an important lesson for many academic conferences – if Eye on Earth can do it, I cannot see a reason why it is not possible elsewhere.

Being philosophical about crowdsourced geographic information


This is a post by Renée Sieber and myself, providing some background on why we wrote the paper “The epistemology(s) of volunteered geographic information: a critique” – in addition to what I’ve written about it in this blog post.

Originally posted on Geo: Geography and Environment:

By Renée Sieber (McGill University, Canada) and Muki Haklay (University College London, UK)

Our recent paper, The epistemology(s) of volunteered geographic information: a critique, started from a discussion we had about changes within the geographic information science (GIScience) research communities over the past two decades. We’ve both been working in the area of participatory geographic information systems (GIS) and critical studies of geographic information science (GIScience) since the late 1990s, where we engaged with people from all walks of life with the information that is available in GIS. Many times we’d work together with people to create new geographic information and maps. Our goal was to help reflect their point of view of the world and their knowledge about local conditions, not always aim for universal rules and principles. For example, the image below is from a discussion with the community in Hackney Wick, London, where individuals collaborated to…


Eye on Earth (Day 2 – Afternoon) – Cost of knowledge, citizen science & visualisation

The first afternoon session was dedicated to Understanding the Costs of Knowledge – Cost of Data Generation and Maintenance (my second day morning post is here)

The session was moderated by Thomas Brooks (IUCN): over the last couple of days we have heard about innovation in the mobilisation of environmental and socio-economic data. All these innovations have a price tag, some quite large; we need to budget and pay for them accordingly. Establishing the costs of knowledge products in biodiversity is important. First, four products were explored, and then the costs analysed.

Richard Jenkins – IUCN Red List of Threatened Species. He explained the list and the essential procedures and components that created it. The Red List is a framework for classifying threatened species; species classified as vulnerable, endangered or critically endangered are included in the list. It is a critical source for conservation – over 75,000 species, with over 3,000,000 people visiting the website each year to find information. The foundation of the information is a structured process with ongoing cycles of evaluation and analysis. It relies on donor support – volunteer time for data collection, as well as professional time to evaluate the information and run an online database. Costs include workshops, training and travel; for professional time there are communications, researchers, developers and fundraisers; and ICT costs cover hosting, maintenance, software licensing, hardware, etc. The costs can be one-off (setting up a new system), recurring (evaluations) and annual (systems and people). Partnerships and voluntarism are essential and need to be recognised. Re-assessments are needed, as well as developing tools and supporting uptake.

Jon Paul Rodriguez – IUCN Red List of Ecosystems, an emerging product. Ecosystem collapse is transformation beyond the typical situation; an example is the Aral Sea, with its impact on wildlife and on human life around it. They use a risk model for ecosystems with four symptoms as criteria, and categories similar to those of the species Red List. They do global assessments at continental and national scales. Costs: compiling the data, which is spatial information, is complex, time-consuming and challenging. There are economies of scale if you do it as regional/global analyses, and while the first assessment is costly, updates will be cheaper. The benefits: ecosystem mapping can be used for other knowledge products (e.g. protected areas), it provides a capacity-building model, and it is done with open-access data. Integrating it with the two Red Lists yields a more effective product. Commercial users will need to pay.

Ian May – BirdLife International – Key Biodiversity Areas (KBAs): sets of information about sites identified for biodiversity conservation using standard criteria by a range of bodies. These include Important Bird Areas and Critical Ecosystem Partnership Fund areas (particular hotspots across multiple taxa). The future direction is to standardise the KBAs. They feed into IFC Performance Standard 6, which forces development banks to take them into account, and they are integrated into the Natura 2000/Birds Directive and the CBD Aichi Targets.

Naomi Kingston – WCMC – protected areas (the Protected Planet product): a project about deliver, connect, analyse and change – the World Database on Protected Areas. It has been in development since 1959, evolving from a list of national parks and equivalent reserves. There are 700 data providers globally, including NGOs and community groups. A database that has evolved over time needs to be treated carefully, considering what each polygon and point means. The data is 91.3% polygons, and the database has grown from 41,305 sites in 1998 to 200,000 today. They raise its profile through different activities, and there is a website – www.protectedplanet.net. Data is supposed to be updated every five years, and is used in the SDGs, in academic research and in the Strategic Plan for Biodiversity. They want to see decisions based on it – e.g. IBAT, which supports business. There is a direct connection between the resources that are available and the ability to provide training, outreach and capacity building.

Diego Juffe – costing the knowledge products. He assessed the financial investment in developing and maintaining biodiversity information, evaluating development costs to 2013, as well as maintenance and future costs. The datasets covered are used in decision-making, academic research and more. They developed a methodology to evaluate primary data collection costs, network support costs, national red lists of species, and the costs of producing scientific papers. They looked at different aspects – personnel, infrastructure, workshops and travel, publication and outreach – across all the funding sources (donors, private sector, government, NGOs, etc.), including volunteer time, and converted everything to 2014 USD, covering data from the 1980s to 2013. To date, the investment in development and maintenance is between $116 and $204 million USD, plus 67,000 to 73,000 volunteer days – almost 200 years of effort. The annual investment is $6.5 million and 12,500 volunteer days per year. Most was funded by philanthropy (53%) and government (27%), with a very large investment in personnel. They expect future investment to 2020 to be in the range of $100 million USD. That will give us a comprehensive baseline; without data we can’t make decisions, and this is very small compared to running a census or other such systems. Some open questions: what is the impact of this investment? Are there better ways to make the products even more cost-effective? What is the real cost of volunteer time? How do we avoid duplication of effort?
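As a quick back-of-the-envelope check on the volunteer-time figure above (my own arithmetic, not part of the study’s methodology), 67,000–73,000 volunteer days does indeed come to roughly 200 person-years:

```python
# Convert the quoted range of volunteer days into person-years (365 days/year).
low_days, high_days = 67_000, 73_000
DAYS_PER_YEAR = 365

years_low = low_days / DAYS_PER_YEAR    # ~183.6
years_high = high_days / DAYS_PER_YEAR  # exactly 200.0

print(f"{years_low:.0f} to {years_high:.0f} person-years")
```

The upper bound works out to exactly 200 person-years, matching the “almost 200 years” quoted in the talk.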

A second afternoon session focused on ‘Everyone is a supplier: crowd-sourcing, citizen science and indigenous knowledge’. Craig Hanson (WRI) opened with the observation that there is a lot of data from remote sensing and professional scientists – but what is the role of citizens? There are 7 billion mobile phones worldwide, and with near-global Internet connectivity, citizens anywhere are now capable of being the eyes and ears of the planet. The session looked at successful approaches for engaging people in crowd-sourcing data and contributing to citizen science, and at how indigenous knowledge can be systematically integrated into decision-making, with applications from around the world. WRI is also involved in this process through Global Forest Watch – which started with partners processing data, but satellites can’t see everything, so JGI and WRI use ODK to provide ground truth on forest clearing.

Jacquie McGlade covered UNEP Live – citizen science was mentioned many times in the summit, but now we need to make voices heard. We need alternative models of how the world operates: all UNEP assessments will include alternative views of Mother Earth – a challenge for the western-science point of view. UNEP Live was designed to give citizens access to data that was collected by governments, but now it also includes citizen science – there is now legislation that includes rights for people to gather data, and that ensures these data are used in decision-making. It’s all about co-production of knowledge, from the structured world of metadata and schemas to the unstructured data of social media and NGOs. Co-production of knowledge requires managing knowledge with ontologies – noticing 23 different definitions of ‘legal’, and many definitions of ‘access’ or ‘forest’, is a challenge. The SDG interface catalogue provides the ontology. Examples come from climate change in the Arctic, and from species monitoring in ecosystem capital accounts that involve forest communities. Motivating people is important – air quality is a great opportunity for citizen science, with local interest in the information. People in Kibera were willing to pay for access to air quality equipment, as they see it as important for their children.

Brian Sullivan (Google Earth Outreach) – everyone is a supplier. Indigenous groups are using tools for telling stories and for environmental monitoring, and the protected area of the Surui has been included in a partnership with Google. They’ve done cultural mapping with the Surui and worked with other communities, who decide whether to make it public or private. Environmental monitoring was another activity, using ODK: they build up resource-use and other information that helps to protect the land, and they are working with other groups in Brazil. Another project is Global Fishing Watch – visualising the fishing fleet. Using machine learning, they have been monitoring fishing, and the tool also allows you to zoom in to a specific ship – monitoring areas where resources are limited and enforcement by sending ships is not possible.

Tuntiak Katan looked at his tribal territory in Ecuador – the national context, indigenous people, climate change and measurement. Ecuador has many indigenous groups – 11 different cultures. He was involved in carbon estimation and ecosystem assessment, working with different groups using traditional ecological knowledge (ancestors’ knowledge). They explored the issues of climate discussions with groups from 9 cultures, with 312 people discussing REDD/REDD+. They carried out measurements in the Amazon demonstrating carbon capture. Now they are carrying out a project in the Kutukú-Shaim region for conservation, restoration and management, selected because the area has many rivers that feed the Amazon. They aim to achieve holistic management: “We and the forest are one”.

Nick Wright from @crowdicity – the belief that in each organisation or community there are transformative ideas that are not seeing the light of day. We are more connected than ever before: technology changes the way people link and interact, and is becoming the norm. Connectivity makes technologies part of the solution, and the vast majority of the world will benefit from it. It’s not just about collecting information but also about connecting the dots and making sense of it. Increased connectivity is challenging hierarchy: how can citizens participate in decision-making, and be given the opportunity to do so? Crowdsourcing is a way to strengthen the relationship between government and the people. Crowdicity worked with Rio to explore the Olympic legacy, creating Agora Rio to allow people to discuss issues and make the city better. They started online and moved to the real world – pop-up town hall meetings to coordinate community groups and reach out from the online conversation to those without access. They had a process that made it possible to work both online and offline. This led to 24 project proposals, of which 4 are going forward, run in cycles of 12 weeks. The important thing is to create a social movement for that period of time – a sense of energy. Crowdsourcing can also work in the UN system – for the post-2015 development agenda it helped to amplify the conversation to 16 million people around the world, taking views from across the globe – BYND 2015 is the first ever crowdsourced UN declaration.

Andrew Hill of @cartodb covered the importance of citizen science in Planet Hunters, but mostly wanted to talk about maps. How do we engage people who can contribute code or technical skills? GitHub is a system that is central to how technologists work – a community of 10 million users, and a successful project can have many participants. But how can we find coders for our projects? A lot of the time there is a lack of contribution beyond the lead developer. We need to engage people in creating technologies for communities. Hackathons can be problematic without thinking beyond the specific event; we need to consider small grants, and also to think about the people somewhere between coding and use. Maps might be the type of data visualisation that changes people’s behaviour most often – maybe, for a tool that makes things easy, it should be a map? Websites like timby.org can allow people to tell their story, and CartoDB also makes it possible for people to take data and show it in different ways.

Discussion: getting people to the idea is possible, but then the challenge is to keep them engaged. Suggestion: give information back, so people see the value in the information. There needs to be a feedback loop for people to see what they have learned and to build expertise – a personal journey of learning is important.

The final plenary was ‘Reaching audiences through innovations in visualisation’. For people to act on information, they need to understand it, and visualisation can increase that understanding. Bringing together leading experts and practitioners, this plenary showcased innovations in data visualisation and applications that advance sustainable development.


Janet Ranganathan showed the WRI Resource Watch. There is a gap between data provision and data use – with so many open data portals, you get lost. We need to help people listen to the signals of the planet and act on them. The opportunity lies in all the data that is coming out. Building on Global Forest Watch, they focus on the nexus of water, food, energy and forests – providing access to data, but also analysis, and then sharing the insights.

Craig Mills talked about the visual experience – it’s not a data revolution but a matter of presenting information. We need to create a fusion between data and storytelling. He provided a walk-through of Resource Watch, showing how to make information personal, and the need to rethink how maps are displayed rather than following conventions from GIS. There are principles for thinking about visualisation: stop to think about sharing – see the connections before things are displayed on the map; get your data to where people already are; make it easy to embed in other places – make a big share button; and use emotions and feelings to create connection. Context is the secret – expect people to use things on phones or tablets; actually think about information as mobile first. Add voice activation and SMS, and we can reach everyone.

Angela Oduor Lungati – Ushahidi – explored how marginalisation stems not from scarcity, but from poverty, power and inequality (UN Human Development Report 2006). She showed how privatisation of water reduces access to water. Ushahidi is a platform that allows ordinary citizens to raise their voices and share information. Information can come via SMS, the web or a smartphone – whatever people have – allowing data collection, management, visualisation and alerts. The pothole theory: there is an event that triggers your action – and it needs to be local and personal. Kathmandu Living Labs used Ushahidi for proper damage assessment in QuakeMap.org, and the tool is also used by the Louisiana Bucket Brigade. Ushahidi has been used by 18 million people in 159 countries, and it is made in Africa. She suggested the metaphor: data are the seeds, platforms are the land, and people are the farmers – technology is just 10% of the solution.

Trista Patterson – the NewMedia Lab at GRID-Arendal, which has a history of many reports and viral graphics. The NewMedia Lab exists to invigorate radical experimentation and rapid prototyping – moving beyond paper-focused design, and connecting people with data, the audience and emotions. Our dependence on technology increases at the expense of envisioning what it is that we deeply need most – and we need to exercise this capability of envisioning. They explore relationships with artists, and envisioning with children. Data + emotions = decisions and actions. Iteration and endurance in experimentation.

The last side event, ‘Citizen Scientists and their role in monitoring of local to global environmental change’, explored a project in Abu Dhabi that involves divers in recording data about sharks, and a project in Bahrain by the Arab Youth Climate Movement, a regional movement. The citizen science programme in Bahrain chose iNaturalist as a way to make people less blind to nature: small sessions open to the public in a natural World Heritage site introduce the concept of citizen science, which is not known to the public, and let participants use the app to help identify species; the organisers would like to see people engage in citizen science from a younger age. Abu Dhabi faces the challenge of engaging divers in monitoring sharks when the Gulf is a major exporter of fins. Initiatives take time to develop, and in Abu Dhabi there is the added challenge that divers are expats who stay for a few years and then leave, requiring continual recruitment.

New paper: The epistemology(s) of volunteered geographic information: a critique

Considering how long Renée Sieber (McGill University) and I have known each other, and that we work in similar areas (participatory GIS, participatory geoweb, open data, socio-technical aspects of GIS, environmental information), I’m very pleased that a collaborative paper that we developed together is finally published.

The paper ‘The epistemology(s) of volunteered geographic information: a critique‘ took some time to evolve. We started jotting down ideas in late 2011, and slowly developed the paper until it was ready, after several rounds of peer review, for publication in early 2014, but various delays led to its publication only now. What is pleasing is that the long development time did not reduce the paper’s relevance – we hope! (we kept updating it as we went along). Because the paper looks at philosophical aspects of GIScience, we needed periods of reflection and re-reading to make sure that the whole paper comes together, and I’m pleased with the way ideas are presented and discussed in it. Now that it’s out, we will need to wait and see how it is received.

The abstract of the paper is:

Numerous exegeses have been written about the epistemologies of volunteered geographic information (VGI). We contend that VGI is itself a socially constructed epistemology crafted in the discipline of geography, which when re-examined, does not sit comfortably with either GIScience or critical GIS scholarship. Using insights from Albert Borgmann’s philosophy of technology we offer a critique that, rather than appreciating the contours of this new form of data, truth appears to derive from traditional analytic views of information found within GIScience. This is assisted by structures that enable VGI to be treated as independent of the process that led to its creation. Allusions to individual emancipation further hamper VGI and problematise participatory practices in mapping/geospatial technologies (e.g. public participation geographic information systems). The paper concludes with implications of this epistemological turn and prescriptions for designing systems and advancing the field to ensure nuanced views of participation within the core conceptualisation of VGI.

The paper is open access (so anyone can download it) and it is available on the Geo website.

Citizen Cyberlab Summit (day 1)

The Citizen Cyberlab Summit is the final event of the Citizen Cyberlab project. The name might sound grand, but the event itself was fairly intimate and focused, with about 40 participants from across the world. The aim of the event was to share the learning from the project and compare it with similar activities around the world. It also provided an opportunity to consider, with experts from different areas, the directions that the project partners should pursue beyond the specific ‘deliverables’ (outcomes and outputs) of the project. The meeting was held at the Confucius Institute of the University of Geneva, which has a mission to improve scientific diplomacy and international links between researchers – a suitable venue for such an international scientific meeting.

An introduction to Citizen Cyberlab was provided by Ariel Lindner (UPD), the main project leader. He noted that the starting point of Citizen Cyberlab is that we know people learn better by doing, and that working with the public is also beneficial for scientists – both for becoming aware of public concerns and because of the moral obligation to share the results of research with those who fund it. The Citizen Cyberlab project, which is in its final months, was based on three parts: platforms, pilots and tools. The platforms aim to lower the barriers to participation for scientists and citizens (computation and participation platforms). The platforms are tested through pilot projects, which are then evaluated for creativity and learning – exploring learning behaviour, creativity and community engagement. The aim is to share the successful experiences, but also the challenges that emerged through the various activities. Among the computation platforms, CitizenGrid is aimed at allowing cloud-based projects to run; RedWire offers a new way to consider game design – an open-source game engine with open game analytics (the ability to measure what people do with the games), used, for example, in the development of science games; GeoKey is the final platform, and it allows people to share their concerns and control information. The project pilots included Virtual Atom Smasher, which is about learning particle physics and helping scientists; GeoTag-X at UNITAR, helping in disaster response; and SynBio4All, which opens up synthetic biology to a wider audience – with games such as Hero Coli and a MOOC on DIY synthetic biology (through iGEM), and activities around ‘the smell of us’, about the odour that people emit and identifying the bacteria that influence it. L’Oréal is interested in developing this research further. There are several Extreme Citizen Science pilots, too.
The tools that were developed in the project included creativity tools such as IdeaWeave.io for exploring and developing ideas, CCL-Tracker for monitoring learning, and the EpiCollect+ system to allow data collection for a wide range of projects.
Both creativity and understanding what people learn are complex to assess – understanding learning had to be studied with other communities in citizen science as well. Finally, there is a specific effort on community engagement through social media and media outlets (YouTube and audio).

The rest of the event was structured as follows: two short presentations from guest speakers outside the project consortium were followed by two demonstrations of a specific platform, tool, pilot or learning activity, and each session ended with discussion in groups, which was then shared back. In all, the summit had four such sessions.

Following this introduction, two guests gave short talks, the first about the World Community Grid (WCG) by Juan Hindo (IBM). Juan provided details of WCG, which is part of IBM’s corporate citizenship group. WCG is a philanthropic programme that supports participation in science through distributed computing, allowing scientists to access large-scale computing by using unused processing capacity in computers and mobile devices. The projects can be ‘the biggest and most fundamentally important activities in labs’, according to researchers who participate in the programme. Examples of success include new solar materials from Harvard University researchers, with thousands of candidate materials. Other breakthroughs happened in childhood cancer research and in Computing for Clean Water, led by Tsinghua University in China, which explored the use of nano-tubes for water filtration. WCG promote Open Science – they ask researchers to make the data publicly available, and focus on humanitarian, real, tangible science, with IBM support. Using the corporate machinery, they get a lot of media attention. They try to engage volunteers as much as possible, and carried out an extensive volunteer study two years ago. Demographically, volunteers are mostly men with a technical background, aged 20-40, who usually volunteer for five years; people join because they want to help science, and learning about the science is a reason to stay. People want to understand the impact of the computations that they perform – beyond just statistics – and ask for the information to be understandable. WCG are now trying to build a more diverse volunteer base, more approachable scientific content, and a clearer articulation of the value of contribution. They see an opportunity to reach out to young people and women; they try to engage people through the story of the science, reassure people that the process is safe, and design the sign-up experience to take a short time.
They also want to leverage existing volunteers – they set up a recruitment competition among them, but it led to very few new people joining. They also use social media on Twitter, YouTube and Facebook. There is growing engagement with social media, but not enough conversion to volunteering. They also work on layering information with researchers – asking for consistent, regular updates on the research, and giving volunteers control over the communication that they receive. Articulating the value of contribution means highlighting research stories – not just computations and numbers of volunteers – and celebrating and promoting scientific success; they lean on networks within IBM to spread the word. The campaign helped double the registration rate to the system. They want to reach more volunteers and they follow conversion rates; they are still missing stories from volunteers and a ‘volunteer voice’, and want to remove barriers to entry. They also want to expand the research portfolio into other areas that the grid can support.

In the discussion that followed, the importance of IP, and of treating volunteers as individuals, came up as topics worth exploring with volunteer computing projects.

The next presentation was Science@home, by Jacob Sherson (University of Aarhus, Denmark). Jacob noted that citizen science offers different difficulty levels and opportunities for user innovation. In Science@home they are trying to extend the range of citizen science involvement with students. They draw on creativity research – trying to evaluate creativity within a positivist empirical framework, controlling different variables and evaluating the creativity of the output accordingly. They run scienceathome.org, with 3,000 people participating in projects and experiments ranging from cognitive science to quantum physics and business administration, supported by an interdisciplinary team from different areas of research. An example of the type of project they deal with is quantum computing – manipulating electrons, which slosh around between states when moved with laser beams. Using analogies to the high-school curriculum was a useful way to engage participants and make it relevant to their studies. They discovered that students can understand quantum physics in a phenomenological way through a game interface, and that gamers find good regions for solutions: the players localised areas of the big parameter space faster than computer simulation. They are also studying the formation of strategies in people’s minds – Quantum Minds. With this programme, they study the process of learning a task and mastering it, looking at the way people learn to solve problems to see if early performance helps predict the ability to learn the topic. Other games include trying to understand innovation in the Alien Game, and a behavioural economics game about the formation of groups. The educational part is about creativity – thinking of motivations for curriculum and fun with different resources.
Game-based education is assumed to improve the curriculum and can increase the motivation to learn. The general approach is to provide personalised online learning trajectories – identifying types of students and learners, correlating them, and creating a personalised learning experience. They also want to train researchers to help them explore.

The next part of the morning session was the two demonstrations, starting with EpiCollect – David Aanensen (Imperial College). EpiCollect was created to deal with infectious disease – who, what, where and when – gathering information about the genetic make-up of diseases. They realised that metadata gathering is a generic problem, and the tool evolved into a generic form-collection and visualisation tool. Current uses of EpiCollect include many veterinary projects, as GPS monitoring of animals is easier in terms of ethics. It was also used by the Food and Agriculture Organization (FAO) to monitor the provision of food to communities in different parts of the world, and in education projects at Bath University in field courses (building on the Evolution MegaLab project to collect information about snails), with students building questionnaires based on the project’s information sheets. They are starting to build longitudinal data. There are projects that link EpiCollect to other systems, such as GeoKey and CartoDB for visualisation.

Red Wire was presented by Jesse Himmelstein (University Paris Descartes). Red Wire is a platform aimed at reducing the barriers to creating games for citizen science through a mash-up approach – code and games are open access to encourage reuse. It uses a functional programming approach in a visual programming environment, taking metaphors from electronics. There are examples of games that students developed during recent summer schools and other activities.

CitizenGrid was discussed by John Darlington (Imperial College London). CitizenGrid is a platform that enables replicating projects on cloud computing, specifically for volunteer computing projects. It allows unified support for volunteer computing – support for the scientists who set up a project, but also for the volunteers who want to join it. The scientists can map their resources by creating both client and server virtual machines and registering the application. They demonstrated it with projects that also use games – allowing the application to be installed on local machines or in cloud computing.

In the breakout groups, participants discussed the complexity of the platforms and the next steps to make them more accessible. For EpiCollect, there is a challenge in identifying who the users are – they are both the coordinators and the data collectors – and helping them set up a useful project is difficult, especially given the need for usability and user-experience expertise. Dealing with usability and user experience is a challenge common to such projects. For RedWire, there is a need to help people who do not have any programming experience – scientists and teachers – to develop games; maybe even gamify the game engine itself, with credits to successful game designers who create components that can be remixed. For CitizenGrid, there is a need for examples of use cases, with Virtual Atom Smasher currently the main demonstrator.

The afternoon session explored pilot projects. CERN@school – Becky Parker (Langton Star Centre) described how she developed, with her students and in collaboration with scientists, the ability to do science at school. The project demonstrates how students and teachers can become part of the science community. It started years ago with students contributing to astrophysics research. The school is involved in fundamental research, with a 17-year-old student publishing a scientific paper based on a theoretical physics research problem presented to the students by professional scientists. Her students also put together an instrument to detect cosmic rays on the satellite TDS-1. They can see where their experiment is through a visualisation over Google Maps that the students developed themselves, and students also created analysis tools for the data. Students can contribute to NASA research on the impact of cosmic rays on International Space Station staff. CERN@school also includes an experiment collecting radiation readings, which helps to map background radiation in the UK (by students aged 14-15). Through their work, they discovered that there aren’t many radiation readings in the ocean, and they plan to address this by mounting a radiation sensor on a sea UAV. All this helps students learn to be scientists. They created the monopole-quest project within the Zooniverse. It is possible to get young people involved in large-scale science projects; it also helps to encourage science teachers and to ensure job satisfaction for teachers. The involvement of girls in the project also leads to more participation in science and engineering after school, with the school having a disproportionate share of the young women who go on to study such topics in the UK.

Rechenkraft.net – From Volunteers to Scientists – Michael Weber (University of Marburg). Michael described how volunteers turned into scientists in the area of volunteer computing. Rechenkraft started in 2005 with a forum dedicated to all distributed computing projects around the world, sharing information about them among German-speaking volunteers; projects are now being translated into other languages, too. This led to the creation of an organisation, which is now involved in many projects, including climateprediction.net. Volunteers also created monitoring programmes that indicate progress and provide statistics about contributions. They hold a yearly face-to-face gathering of volunteers from across Germany and beyond, which has led to them building their own data processing racks and other initiatives. They started in an electronic sports league, but then realised that there are opportunities to assist scientists in developing new projects, and that led to Yoyo@home, which allows the community to help scientists develop BOINC projects. They regularly participate in conferences and exhibitions to promote the opportunity to other people interested in technology, and they became part of the Quake-Catcher Network. They receive significant press coverage – eventually the city of Marburg (Germany) offered the organisation physical space, which became the city’s hackspace. Once there was a steady place, they created more sophisticated cluster computers. They also set up the WLAN in the local refugee camp. Finally, they also develop their own scientific project – RNA World, which is a completely internal project. They encountered problems with very large output files from simulations, so they are learning about running distributed computing projects as scientists who use the results, and not only as volunteers. They are also starting to run projects on tree health, recording data such as location, photos and plant material.
Similarly, they map protected flowers – all on a volunteer basis. They participate in the effort to develop a citizen science strategy 2020 for Germany, and they would like funding to be available to the average person so that they can participate in projects. There is a risk that citizen science will be co-opted by scientists – there is a need to leave space for grass-roots initiatives. There are also barriers to publication. The need for lab results in addition to the simulations encouraged the creation of a wet lab.

The last short guest talk came from Bernard Revaz, who suggested creating Massive Multiplayer Online Science – using game environments like WoW (World of Warcraft) to do science. His aim is to inject science into games such as EVE Online – at any given time there are 40,000 users, with a median age of 35, and 50% hold a degree in science. In EVE Online, they designed an element from the Human Protein Atlas that the gamers will help to classify. The stakeholders in their discussion include the scientists, the gaming company and the players, and all are very positive about the prospect. In EVE Online there are many communities – they are creating a new community of scientists that people join voluntarily, and they are working on matching the science tasks to the game narrative and to the game reward system.

After these two guest talks, there were two Demos. 

First, Virtual Atom Smasher (VAS) – Ioannis Charalampidis (CERN). The VAS reflects the way CERN develops the science cycle: observing the situation leads to theory by theoretical physicists, and then experiments are carried out to test it. The process includes computer simulations that are explored against experimental data, adjusting the models until they reflect the results. VAS evolved from a project by a 15-year-old student in 2010, who managed to create the best-fitting results of a simulation. The VAS is about real cutting-edge science, but it is also very challenging, so they created a game (but don’t use the word game – it’s a simulation). The VAS uses CitizenGrid and RedWire for the game, and CCL-Tracker to understand the way people use the platform. The analytics show the impact of training on the desired flow of the game. The VAS combines exploration with opportunities for learning.

Geotag-X – Eleanor Rusack (UNITAR). This is a platform to crowdsource the analysis of images in humanitarian crises. They usually use satellite imagery to deal with crises, but some images have limitations – roofs, clouds etc. – and there is a need to know what is going on on the ground. The idea is to harvest photos coming from a disaster, analyse them, and share the knowledge. A lot of the information in photos can be very useful – it’s possible to extract structural information and other details from an image. They have a workflow: experts set up projects, then develop the structure of the processing, the tutorials, and tools for photo collection (from Flickr, Twitter, EpiCollect and a Chrome extension). The photos are added to the analysis pool. They created a project to allow people to assess Yemeni cultural heritage at risk as a result of the conflict there. The system is mostly based on self-learning. Geotagging photos is a challenging task, and it is especially an area that needs more work. The experts are professionals or academics in a specific domain who can help design the process, while participants come from different backgrounds. They recruit people through SciStarter, Mozilla Science and similar channels, and keep in touch with online volunteer groups – people who come from SciStarter tend to stay. Digital volunteers also help a lot, and they encourage volunteering through presentations, but most important are data sprints. They evaluate agreement between analysts – high agreement indicates an image that is easy to analyse. Looking at the range of responses, they identify three groups: easy (high agreement, low standard deviation), mid (high standard deviation and median agreement) and complex (low agreement, low standard deviation). Analysing images against these agreement levels helps to improve designs. They want to move the questions up the curve, and ask how to train a large number of analysts when project leaders have limited time.
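To make the easy/mid/complex grouping concrete, here is a minimal sketch of how photos could be binned by per-question agreement statistics. This is an illustration only – the function names, thresholds and the definition of agreement (fraction of analysts choosing the most common answer) are my assumptions, not Geotag-X’s actual method:

```python
from collections import Counter
from statistics import mean, stdev

def question_agreement(answers):
    """Fraction of analysts who chose the most common answer to one question."""
    return Counter(answers).most_common(1)[0][1] / len(answers)

def classify_photo(questions, hi=0.75, lo=0.5, spread=0.15):
    """Group a photo as easy / mid / complex from per-question agreement.

    questions: list of answer lists, one list of analyst answers per question.
    Thresholds (hi, lo, spread) are illustrative assumptions.
    """
    scores = [question_agreement(q) for q in questions]
    avg = mean(scores)
    sd = stdev(scores) if len(scores) > 1 else 0.0
    if avg >= hi and sd <= spread:
        return "easy"      # consistently high agreement, low standard deviation
    if avg <= lo and sd <= spread:
        return "complex"   # consistently low agreement, low standard deviation
    return "mid"           # variable agreement across questions

# Example: eight analysts answer two questions about one photo, all agreeing
print(classify_photo([["flood"] * 8, ["roof damage"] * 8]))  # → easy
```

Binning photos this way would let project designers spot which questions analysts converge on quickly and which need redesign or better tutorials – the ‘moving questions up the curve’ mentioned above.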

The follow-up discussion explored improvements to VAS – such as integrating arts, or linking a BOINC project that would contribute computing resources to the VAS. For Geotag-X, the discussion explored the issue of training – with ideas about involving volunteers in getting the training right, running virtual focus groups, or exploring design aspects and collaboration between volunteers.