Eye on Earth (Day 2 – Morning) – moving to data supply

The second day of Eye on Earth moved from data demand to data supply. You can find my posts from day one, covering the morning and the afternoon sessions. I have only partial notes on the plenary Data Revolution – data supply side, although I've posted the slides from my talk separately. The description of the session stated: The purpose of the session is to set the tone and direction for the “data supply” theme of the 2nd day of the Summit. The speakers focused on the revolution in data – the logarithmic explosion both in terms of data volume and of data sources. Most importantly, the keynote addresses will highlight the undiscovered potential of these new resources and providers to contribute to informed decision-making about environmental, social and economic challenges faced by politicians, businesses, governments, scientists and ordinary citizens.

The session was moderated by Barbara J. Ryan (GEO): the volume of Landsat data downloads demonstrates the information revolution. From 53 scenes/day to 5,700 scenes/day once it became open data – demonstrating the power of open. There are now well over 25 million downloads a year. There is a similar experience in Canada, and there are also new and innovative ways to make the data accessible and useful.

The first talk was from Philemon Mjwara (GEO): the amount of data is growing and there is an increasing demand for Earth Observations, but even in the distilled form of academic publications there is an explosion, and it's impossible to read everything about your field. Therefore we need to use different tools – search engines, article recommendation systems. This is also true for EO data – users need the ability to search, then process, and only then can they use the information. This is where GEO comes in. It's about comprehensive, effective and useful information. GEO works with 87 participating organisations. They promote Open Data policies across their membership, as this facilitates the creation of a global system of systems (GEOSS). GEOSS is about supply, and through the GEO infrastructure data can be shared with many users. We need to remember that the range of sources is varied: from satellites, to aerial imagery, to under-sea rovers. GEO works across the value chain – the producers, value-added organisations and the users. An example of this working is analysis that helps to link information about crops to information about potential vulnerability in food prices.

Mary Glackin (The Weather Company) reviewed how weather data is making people safer and businesses smarter. The Weather Company is about the expression of climate in the patterns of weather. Extreme events make people notice. Weather is about what happens in the 100 km above the Earth's surface, but also in the oceans – 3.6 km deep on average – which we don't properly observe yet and which have an impact on weather. There are 3 challenges: keeping people safe, helping businesses through forecasting, and engaging with decision makers. Measuring the atmosphere and the oceans is done by many bodies beyond official agencies – it now includes universities and companies, but also citizens' observations collected across the world (through Weather Underground). The participants, in return, receive a localised forecast for their area and details of nearby observations. It's a very large citizen science project, and engagement with citizen scientists is part of their work. Forecasting requires complex computer modelling – and they produce 11 billion forecasts a day. Engaging decision makers can mean an individual fisherman who needs to decide whether or not to go out to sea. There is a need for an authoritative voice that creates trust when there are critical issues such as response to extreme events. Another example is the use of information about turbulence from airplanes, which is then used to improve modelling and provide up-to-date information to airlines to decide on routes and operations. Technology is changing – for example, smartphones now produce air pressure data and have other sensing abilities that can be used for better modelling. Policies are required to enable data sharing, as are partnerships between government and private sector companies.
A good example is NOAA agreeing to share all their data with cloud providers (Microsoft, Amazon, Google) on the condition that the raw data will be available to anyone to download free of charge, while the providers are free to create value-added services on top of the data.

Next was my talk, for which a summary and slides are available in a separate post.

Chris Tucker (MapStory) suggested that it is possible to empower policy makers with open data. MapStory is an atlas of change that anyone can edit, as can be seen in the development of a city, or the way enumeration districts evolved over time. The system is about maps, although the motivation to overlay and collect information can be genealogy – for example, being able to identify historical district names. History is a good driver for understanding the world, for example maps that show the colonisation of Africa. The information can be administrative boundaries, imagery or environmental information. He sees MapStory as a community. Why should policy makers care? They should because 'change is the only constant', and history helps us understand how we got here and think about directions for the future. Policy needs to rely on data coming from multiple sources – governmental sources, NGOs, or citizens' data. There is a need for a place to hold such information and weave stories from it. Stories are a good way to work out the decisions that we need to make, and they also allow ordinary citizens to give their interpretation of information. In a way, we are empowering people to tell stories.

The final talk was from Mae Jemison (MD and former astronaut). She grew up during a period of radical innovations, both social and scientific – civil rights, new forms of dance, visions of a promising future in Star Trek, and the Apollo missions. These led her to get to space on a Shuttle mission in 1992, during which she was busy with experiments most of the time, but from time to time looked out of the window to see the tiny sliver of atmosphere around the Earth, within which all of life exists. Importantly, the planet doesn't need protection – the question is: will humans be in the future of the planet? Every generation has a mission, and ours is to see ourselves linked to the totality of Earth – life, plants and even minerals. Even if we create a way to travel through space, the vast majority of us will not get off this planet. So the question is: how do we get to the extraordinary? This leads us to look at data, and we need to be aware that while there is a lot of it, data doesn't necessarily mean information, and information doesn't mean wisdom. She noted that in medical studies, data (from tests with patients) have the characteristics of specificity (is it relevant to the issue at hand?) and sensitivity (can it measure what we want to measure?). We tend to value and act upon what we can measure, but we need to consider whether we are doing it right. Compelling data causes us to pay attention, and can lead to action. Data connects us across time and to understanding a universe greater than ourselves, as the pictures from the Hubble telescope that show the formation of stars do. These issues come together in her current initiative, '100 Year Starship' – if we aim to have an interstellar ship built within the next 100 years, we will have to think about sustainability, life support and ecosystems in a way that will help us solve problems here on Earth. It is about how to have an inclusive journey to make transformation on Earth.
She completed her talk by linking art, music and visualisation with the work of Bella Gaia.

After the plenary, the session Data for Sustainable Development built on the themes from the plenary. Some of the talks in the session were:

Louis Liebenberg presented CyberTracker – showing how it evolved from early stages in the mid 1990s to use across the world. The business model of CyberTracker is that people can download it for free, but it is mostly used off-line in many places, with the majority of users treating it as a local tool. This raises issues of data sharing – data doesn't go beyond the people who manage the project. CyberTracker addresses the need to extend citizen science activities to a whole range of participants beyond the affluent population that usually takes part in nature observations.

Gary Lawrence discussed how, with Big Data, we can engage the public in deciding which problems need to be resolved – not only the technical or scientific community. Patterns will emerge within Big Data that might be coincidence or causality, and many cases are coincidental. The framing should be: who are we today? What are we trying to become? What has to be different two, five, ten years from now if we're going to achieve it? Most organisations don't even know where they are today. There is also the question of whether Big Data is driven by a future that people want. There are good examples of using Big Data in a city context that take into account the needs of all groups – government, business and citizens – in Helsinki and other places.

B – the Big Data in ESPA experience (www.espa.ac.uk) – data doesn't have value until it is used. ESPA is an international, interdisciplinary science programme on ecosystem services for poverty alleviation. Look at the opportunities first, then the challenges. Opportunities: the SDGs are an articulation of a demand to deliver benefits to societal needs through new data-led solutions for sustainable development, with new technologies – remote sensing / UAVs, existing data sets, citizen science and mobile telephony – combined with open access to data and web-based applications. Citizen science is also about empowering communities with access to data. We need to make commitments to take data and use it to transform lives.

Discussion: lots of people are sitting on lots of valuable data that is considered private and is not shared. A commitment to open data should help in solving the problems of making data accessible and ensuring that it is shared. We need to make projects aware that the data will be archived and have procedures in place, which also requires staff and repositories. An issue is how to engage private sector actors in data sharing. From his work with indigenous communities, Louis noted that the most valuable thing is that the data can be used to transfer information to future generations and explain how things are done.

Eye on Earth Summit 2015 talk – Extreme Citizen Science – bridging local & global

Thanks to the organisers of the Eye on Earth Summit, I had an opportunity to share the current state of technological developments within the Extreme Citizen Science (ExCiteS) group with the audience of the summit: people who are interested in the way environmental information sharing can promote sustainability.

The talk, for which the slides are provided below, is made of two parts. The first is an overview of current citizen science and where the extremities of current practice lie, and the second covers the current state of development of the technological work that creates the tools, methodologies and techniques to allow any community, regardless of literacy, to develop their own citizen science projects.

I have addressed the issues at the beginning of the talk in earlier talks (e.g. the UCL Lunch Hour Lecture), but have now found a way to express them in several brief slides which present the changes in science and in education levels in the general population as important trends that power current citizen science. If we look at early science (roughly until the early 19th Century), professional science (roughly from the middle of the 19th Century all the way through the 20th Century) and the opening of science in the past decade, we can see an ongoing increase in the level of education in the general population, and this leads to different types of participation in citizen science – you couldn't expect more than methodical, basic data collection by volunteers in the early 20th Century, while today you can find many people who have a good grasp of scientific principles and are readily sharing data that they are interested in.

After exploring the limits of current citizen science in terms of the scientific process and the levels of education expected from participants, I turn to our definition of extreme citizen science, and then focus on the need to create technologies that are fit for use within participatory processes and that take into account local and cultural sensitivities, needs and wishes about the use of the data. In particular, I explain the role of Sapelli and its use in participatory processes in the Congo basin, the Amazon and potentially in Namibia. I then explain the role of GeoKey in providing an infrastructure that can support community mapping, ending with the potential of creating visualisation tools that can be used by non-literate participants.

The slides are available below.

Eye on Earth (Day 1 – afternoon) – policy making demand for data and knowledge for healthy living

The afternoon of the first day of Eye on Earth (see the previous post for the opening ceremony and the morning sessions) had multiple tracks. I chose to attend Addressing policy making demand for data: dialogue between decision makers and providers.

The speakers were asked to address four points covering issues of data quality control and assurance, and to identify the major challenges facing data quality for decision-making in the context of crowdsourcing and citizen science. Felix Dodds, who chaired the session, noted that the process of deciding on indicators for the SDGs is managed through the UN Inter-agency group, and these indicators and standards of measurement need to last for 15 years. There is now also a 'World Forum on Sustainable Development Data', and a review of the World Summit on the Information Society (WSIS) is also coming. The speakers were asked to think about coordination mechanisms and QA: how do we ensure good quality data? How accessible is the data? Finally, what is the role of citizen science within this government information? We need to address the requirements of the data – at international, regional, and national levels.

Nawal Alhosany (Masdar Institute): data is a very important ingredient in making policy when you try to base policy on facts and hard evidence. Masdar is active throughout the sustainability chain, with a focus on energy. The question is how to ensure that data is of good quality, and Masdar recognised a gap in the availability of data 10 years ago. For example, some prediction tools for solar power were not taking local conditions into account, nor quality assurance suitable to local needs. Therefore, they developed local measurement and modelling tools (ReCREMA). In terms of capacity building, they see issues in human capacity across the region and try to address them (e.g. the lack of an open source culture). In Masdar, they see a role for citizen science – and they make steps towards it through STEM initiatives such as Young Future Energy Leaders and other activities.

David Rhind (Nuffield Foundation): many of the data sets that we want cross national boundaries – e.g. the radioactive plume from Chernobyl. When we want to mix population and environment, we need to deal with mixing boundaries and complex problems of data integrity. There are also serious problems with validity – there are 21 sub-Saharan countries that haven't done a household survey since 2006, so how can we know about levels of poverty today? There is a fundamental question of what quality is, and how we can define it in any meaningful sense. Mixing data from different sources creates a problem of what quality means. Some cases can rely on international agreements – e.g. UN principles, or the UK regulatory authority that checks statistics. Maybe we should think of international standards like those in accountancy. In terms of gaps in capacity, there is rapid change driven by the need for analysis, and data scientists are becoming available in the UK, but there is an issue with policy makers who do not have the skills to understand the information. Accessible data is becoming common with the open data approach, but many countries make official data less open for security reasons. However, data needs certain characteristics – it needs to be re-usable, easy to distribute, public and openly licensed. On the issue of citizen science – there are reasons to see it as an opportunity, e.g. OpenStreetMap, but there are many factors that make its integration challenging. There is a need for proper communication – e.g. the miscommunication in L'Aquila.

Kathrine Brekke (ICLEI) – a perspective from local government. Local governments need data for decision-making. Data also makes a city suitable for investment and insurance, and improves transparency and accountability. There are issues of capacity in terms of collecting the data and sharing it, even down to language skills (if data is not available in English, international comparison is difficult). There are initiatives such as open.dataforcities.org to allow sharing of city data. There are 100 sustainability indicators that are common across cities and can be shared. In terms of data quality, we can also include crowdsourcing – but then we need to ensure that the data will be systematic and comparable. Standards and consistency are key – e.g. the greenhouse gas registry is important, and therefore there is a global protocol for collecting the data.

Ingrid Dillo (DANS, Netherlands): there is a data deluge with a lot of potential, but there are challenges around the quality of the data and trust. Quality is about fitness for use. DANS's aim is to ensure the archiving of data from research projects in the Netherlands. Data quality in science is made up of scientific data quality but also technical quality. Scientific integrity is about the values of science – standards of conduct within science. There are issues with fraud in science that require better conduct. Data management in small projects lacks checks and balances, with peer pressure as the major driver to ensure quality – so open science is one way to deal with that. There are also technical issues such as metadata and data management, so data can be used and stored in a certified, trustworthy digital repository.

Robert Gurney (University of Reading) – in environmental science there is the Belmont Forum e-Infrastructures & Data Management initiative. The Belmont Forum is an association of environmental science funders from across the world, and the initiative is there to deal with the huge increase in data. Scientists are early adopters of technology, and some of the lessons from what scientists are doing can be used by other people in the environmental sector. The aim is to deliver the knowledge that is needed for action; the infrastructure is needed to meet global environmental challenges. This requires working with many supercomputers – the problems are volume, variety, veracity and velocity (Big Data) – we are getting many petabytes, and it could reach 100 petabytes by 2020. The problem is that data sits in deep silos – even between Earth Observation archives. There is a need to make data open and sharable. 10% of funding will go towards e-infrastructure. They created data principles and want to have the principle of open by default.

Marcos Silva (CITES): CITES is about the trade in endangered species. CITES (since the mid 1970s) regulates trade in a multi-billion dollar business with 850,000 permits a year. Each permit says that it's OK to export a specimen without harming the population. It is data driven. CITES data can help in understanding outliers and noticing trends. There are issues of ontologies, schemas, quality etc. between the signatories – similar to environmental information. They would like to track what happens to species across the world. They are thinking about a standard for all transactions with specimens, which will create a huge amount of data. Even in dealing with illegal poaching and the protection of animals, there is a need for interoperable data.

Discussion: the Data Shift initiative deals with citizen-generated data for the SDG goals. Is there data that is already used? How are we going to integrate it with other types of data? We risk filtering citizen science data out because it follows a different framework. Rhind – statisticians are concerned about citizen science data, and will take a traditional view and not use the data. There is a need to have quality assurance throughout, not just at the end. The management of indicators and their standards will require the inclusion of suitable data. Marcos asked what is considered citizen science data – e.g. reporting of data by citizens is used in CITES, and there are things to learn about how the quality of the data can be integrated with the traditional processes that enforcement agencies use. Science is not just data collection and analysis – in projects such as climateprediction.net, multiple people can analyse information. Kathrine talked about crowdsourcing – e.g. reporting of trees in certain cities, so there is also a dialogue about deciding which trees to plant. Ingrid – disagreed that data collection on its own is not science. Nawal – doing projects with schools about energy, which opens participation in science. Rhind – raised the issue of the need for huge data repositories and the question of whether governments are ready to invest. Gurney – there is a need to coordinate the multiple groups and organisations that are dealing with data, and a huge shortage of people in environmental science with advanced computing skills.

The second session that I attended explored Building knowledge for healthy lives, opened by Jacqueline McGlade – in the context of data we need to focus on the SDGs, and health underpins more goals than environmental issues do. UNEP Live is aimed at allowing access to UN data – from country data to citizen science data – so it can be found. The panel explored many of the relations to health: climate change and its impact on people's lives and health; heatwaves and issues of vulnerability to extreme events. Over 40 countries want to use the new air quality monitoring that UNEP developed, including the community in Kibera.

Hayat Sindi is the CEO of the i2 Institute, exploring social innovation. Our ignorance about the world is profound. We are teaching children foundational theories without questioning science heroes and theories, as if things are static. We are elevating ideas from the past and don't question them. We ignore the evidence. The fuel for science is observation. We need to continue to create technology to improve life. Social innovation is important – and she learned this from Diagnostics For All (DFA) from MIT. The DFA diagnostic is low cost, portable, easy to use and safely disposable. The full potential of social innovation is not fulfilled. True scientists need to talk with people, understand their needs, and work with them.

Maria Neira (WHO) – all the SDGs are linked to health. A core question is: what are the environmental determinants of health? Climate change, air quality – all these are part of addressing health and wellbeing. There is a need to provide evidence-based guidelines, and the WHO also promotes health impact assessment for major development projects. There are different sectors – housing, access to water, electricity – some healthcare facilities lack access to a reliable source of energy. Air pollution is a major issue that the WHO recognises as a challenge – killing 7 million people a year. With air quality we don't have a choice, nor a warning label like we do with tobacco. The WHO is offering indicators – for example, around access to energy – that require measuring exposure to air pollution. There is a call for strong collaboration with other organisations. A global platform on air quality and health is being developed, aiming to enhance estimation of the impacts of air quality.

Joni Seager (GGEO coordinating lead author) talked about gender and the Global Gender and Environment Outlook. She looks at how gender is not included in health and environmental data. The first example – collecting gender data and then hiding it: gender analysis can provide better information that can help in decision making and policy formation. The second – treating the household as the unit of analysis, as if its members don't differ in agency over education, access to a car or food security; in reality there is no evidence that food security is a household-level attribute – men and women have significantly different experiences and coping strategies. Household data presents the view of the men and not the real information, and makes women especially invisible. There are also cases where data is not collected at all – in some areas, e.g. sanitation, information is simply missing. If we are building knowledge for healthy lives, we should ask: whose knowledge and whose life?

Parrys Raines (Climate Girl) grew up in Australia and wants to protect the environment – she heard about climate change as a 6-year-old and then sought to research and learn about the data, but found that the information is not accessible to young girls. She built close relationships with UNEP. There are different impacts on young people. She is also sharing information about air quality and pollution to allow people to include youth in the discussion and in solutions. Youth need to be seen as a resource across different levels – a sharing generation, thinking globally. Intergenerational communication is critical, and knowledge of data is critical for the 21st century. Organisations need to go out and support youth – from mentoring to monetary support.

Iman Nuwayhid talked about health and ecological sustainability in the Arab world. Some of the Millennium Development Goals (MDGs) have been achieved, but most of the countries fell short of achieving them. In ecological sustainability, the picture in the Arab world is gloomy – many countries don't have access to water, and demand for food is beyond the capacity of the region to produce. The population is expected to double in the next 30 years. Poorer countries have high fertility, and there is a lot of displacement: war, economic and environmental. In development, there are striking inequities in the region – it contains some of the wealthiest and some of the poorest countries in the world. The distribution of water needs to consider which sector should use it. Comparing health versus military expenditure, the Arab world spends much more on the military than on health. There is interaction between environment, population and development. The region's ecological footprint is among the highest and is increasing. There are also issues of political instability that can be caused by environmental stresses. The displacement of people between countries creates new stresses and questions the value of state-based analysis. Uncertainty is a major context for the region and for science in general.

Discussion: on the air quality issue – monitoring is not enough without understanding toxicity and dispersion. Air pollution is also affected by activities such as stone quarries. There is a need to balance monitoring efforts with accuracy and the costs of acting, and to develop models and methods to think about their use. In some urban areas, light and noise also have impacts – not just on mortality but on quality of life and mental health.

Two side events of interest run in parallel:

The European Environmental Bureau presented a side event on collaborative research and activist knowledge on environmental justice. Pressure on resources means extractive industries operate in the South with the outcomes used in the North, and there is an increased level of conflict in the South. The EJOLT project is a network of 23 partners in 23 countries. It is collaborative research between scientists, grassroots organisations, NGOs and legal organisations. They have produced a whole set of results; the most visible is the Atlas of Environmental Justice. There is plenty to say about citizen science and how important it is that information comes from people who are close to the ground. They work with a team in ecological economics that created a moderated process for collecting and sharing information. The atlas allows looking at information according to different categories, and this is linked to stories about each conflict and its history, as well as further details about it. The atlas is a tool to map conflicts but also to try and resolve them. The EEB sees the atlas as ongoing work and wants to continue to develop sources of information and reporting. Updating and maintaining the tool is a challenge that the organisation faces.

At the same time, the best practice guidelines Putting Principle 10 into Action were launched, building on the experience of the Aarhus guide. There are plenty of case studies and information, and it will be available on the UNEP website.

The gala dinner included an award to the Senseable City Lab project in Singapore, demonstrating the development of personalised travel plans that can help avoid pollution, based on 30-40 participants who collected data using cheap sensors.

Eye on Earth (day 1 – morning) – opening and the need for data

Four years after the first Eye on Earth Summit (see my reflections on the 2011 event here, and the Dublin meeting in 2013 here), the second summit is being held in Abu Dhabi. Eye on Earth is a meeting dedicated to the coordination of environmental information sharing at all scales so it can be used for decision making.
The 2015 summit is structured around 3 core themes, with each day focusing on one aspect: data demand, data supply and enabling conditions. By its nature, environmental information is geographical, so the meeting includes people from different aspects of geographical information production and management – from satellites and remotely sensed data to citizen science.
The first day started with an opening ceremony, with a statement that on Earth, people and nature are linked together – for example, the link between the Sahara and the Amazon through dust that transfers nutrients. We came to know that through information that comes not only from big organisations like NASA; there are also many citizen scientists who report what happens to the dust that does not travel all the way. Integrating all these bits of information brings with it questions about ownership, how it is used and who uses it – all questions that were explored at Eye on Earth.
The opening video conveyed messages about the importance of looking after the planet, noticing the connections between elements of nature, the stresses it is currently experiencing, and the potential of information and information sharing to support better decisions. Sharing information about society, about one another – “and the earth itself”. Eye on Earth is a network of networks.
H.E. Razan Khalifa Al Mubarak opened on behalf of AGEDI, the Abu Dhabi Global Environmental Data Initiative. She provided the context of the development of Eye on Earth from its early stages, and the special initiatives that are done as part of it. AGEDI has been running for 13 years, with a range of initiatives to support environmental information locally and globally. (I only partially summarised H.E. Razan Khalifa Al Mubarak and nothing from H.E. Anwar Gargash because of a lack of access to translation.)
Achim Steiner, who heads UNEP, explained that Eye on Earth is not just about what a group of environmental managers and scientists are focusing on – it takes us back to the Blue Marble, which demonstrated the uniqueness and fragility of the planet. The state of the planet in 2015 is worse than 40 years ago, when Apollo 17 took that image – from the atmosphere to biodiversity, the balance sheet of the planet is pointing in the wrong direction. Billions of people have been added to the planet, and there has been extraordinary progress – but there is a deeper sense of discomfort that responsibility to nature is lacking. We now have millions of ways of looking at the planet and understanding what is happening. The agenda for the Sustainable Development Goals (SDGs) was adopted a few days ago at the UN – it is now understood that environmental issues cannot be seen as separate from development, and we now have universal goals to create environmental knowledge and expertise with the data that becomes available. UNEP Live is an attempt to create an open data network and is linked with AGEDI initiatives. The world is frustrated – we can't describe problems in the abstract. Data management is crucial to developing systemic solutions – we live in ecosystems, but also in social and economic systems. We can't talk about a world of 10 billion people without a transition to low carbon solutions. We need to deal with equality and justice. There are new markets in pay-as-you-go off-grid solar energy, and a range of solutions that will guide us to the future. UNEP shared a video about the pressures that we are experiencing (loss of species, climate change) and the need to act now for people and planet, calling for people to join the discussion at myunea.org.
H.E. Rashid Ahmed Mohammed Bin Fahad – the minister responsible for environment and water in the UAE – highlighted Abu Dhabi's long leadership on environmental issues since its foundation. Data are very important to evaluate the progress that has been achieved since the previous summit, and to understand it. Data and environmental information are critical to the UAE, especially given all the development in the area. The national agenda for 2021 also aims to have different ways of accessing and using data. Ecoprint, launched in 2007, is a partnership for sharing information and systems for lighting, energy and water; it helps reduce environmental impact in the UAE. The UAE aims to turn its economy green in the coming years, which is also important for competitiveness. We need to acknowledge the importance of data and information, bridge the gap between developing and developed countries, and have accurate data.
Following the opening, the first plenary session focused on Data Demand which “provide an overview of the key political and societal agreements and ambitions for a transition to a sustainable future. Highlighting the opportunities and challenges we face regarding data, information and knowledge. An increased evidence and knowledge base is required to support policy and decision makers in delivering on these commitments and in tracking progress. Never before has the world had the need for – and access to – so much data and information enhanced through rapidly evolving technologies and multi-stakeholder engagement. Achieving sustainable development is not possible without all of society playing its role. This requires leadership, partnership and accountability from the UN, governments, the private sector and civil society.”
The first presenter was H.E. Mohammed Al Ahbabi, who covered UAE space activities. Space is important – for national security and the economy – and there is a new space race, with over 60 countries participating. Space capabilities are important for environmental monitoring through Earth observation, and this week is World Space Week. The UAE identified space as an important activity a long time ago, with $5.5B of investment, from telecommunications to Earth observation. The UAE set up a space agency a year ago, coordinating activities – building capacity and regulating the sector. It also leads a space mission to Mars – a science mission to explore the Martian atmosphere in order to understand things on Earth – aiming to launch in 2020 so it arrives at Mars in 2021, marking 50 years of the UAE. The UAE aims to have 12 satellites, providing valuable information to help protect the environment.
Thani Al Zayoudi – the UAE representative to the International Renewable Energy Agency, in the Ministry of Foreign Affairs. Eye on Earth is about the role of the UAE in both providing data and using it. The UAE welcomes the SDGs, which require sharing data at many levels; this is also required for COP21. The UAE is engaged in a process of creating world-class national data. It also aims to coordinate environmental data, a full accounting of carbon emissions and more – the aim is to have the data to know where they want to go. There are KPIs for the UAE which include many environmental goals for 2021.
Naoko Ishii, CEO of the Global Environment Facility, was involved in setting the SDGs. The SDGs recognise that development ambitions are limited by planetary boundaries and the need to protect the environment. We need multi-stakeholder engagement to address these issues. Finally, there is plenty of need for access to data. We are entering a special period in which we can get information about the Earth from satellites, social media, sensors and many other sources. Yet, for those who work in developing countries, there is a gap in capabilities in many governments and communities. Data has enhanced GEF projects – knowing more leads to better policy. High-resolution data is helpful for planning, from disaster preparedness to climate adaptation. There is also more marine data. Reliable, timely information can lead to better enforcement of agreed goals and targets, as demonstrated in Global Forest Watch. We need to make sure that capacity is built from local to national levels to allow people to use the data. GEF is paying extra attention to augmenting capabilities – but the challenge is massive and needs to be addressed. GEF has made improving knowledge management a goal. Second point: the need to promote integrated approaches to help make change, and analysis to understand how such integrated approaches can become part of policy making. Multi-stakeholder partnerships are needed – they are enhanced by multiple sources of information, for example, linking commodity flows to forest monitoring. Better informed governments, businesses and citizens can make better decisions that benefit them and the environment. The commitment to the SDGs can help make it happen.

Mathis Wackernagel (Global Footprint Network) – with resource consumption growing in China, India and the US, what should small countries do, if we assume that resource demand will continue to grow? You need to prepare your country for the future. Imagine a boat with a hole: it does not make sense to wait for all the other boat owners to fix their boats. He explained how to calculate the ecological footprint, using the global hectare as a unit. Over the last 20 years, most of humanity has come to live in countries that have passed their ecological capacity. Every country has its own characteristics and aspects, and there are differences between footprint and biocapacity. Countries are working with GFN to bring their resource demands down; they also work with financial institutions. Countries should move to action regardless of the global level of action: understand the country's resource situation, including trends; assess trading partners and how they perform; and consider which product lines will be needed more and which less.
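A back-of-the-envelope sketch of the bookkeeping Wackernagel described: a country's ecological balance is its biocapacity minus its footprint, both measured in global hectares (gha) per person. The country names and figures below are invented purely for illustration.

```python
# Illustrative sketch of ecological-footprint accounting.
# All figures are hypothetical, in global hectares (gha) per capita.

footprint = {"CountryA": 8.2, "CountryB": 1.1, "CountryC": 4.7}    # demand
biocapacity = {"CountryA": 3.9, "CountryB": 1.6, "CountryC": 4.7}  # supply

def ecological_balance(country):
    """Positive value = biocapacity reserve; negative = ecological deficit."""
    return biocapacity[country] - footprint[country]

for c in footprint:
    balance = ecological_balance(c)
    status = "reserve" if balance >= 0 else "deficit"
    print(f"{c}: {balance:+.1f} gha/person ({status})")
```

A country in deficit either imports biocapacity, depletes its own stocks, or emits carbon beyond what its ecosystems absorb – which is why GFN urges looking at trading partners' balances too.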

Robbie Schingler (Planet Labs) – we're in a sensor revolution, there are new entrants every day, and we need to join forces. We are going through sensor revolutions – mobile phones, drones, and projects such as OpenROV for the marine environment. We are getting to the point of a 'transparent planet', where we can use all this information to understand the world in real time. Space is also changing, with consumer electronics and advances in manufacturing, but access to space is still limited. Planet Labs' mission is to build a large constellation of satellites, image the whole Earth every day, and provide universal access to the data. They were in a garage when Eye on Earth 2011 happened, and that will continue to happen – new people will join in all the time. They created satellites that are tiny compared to Landsat – making it possible to put more in orbit. They aim to have 100 satellites in a line so they can scan the Earth, and they already have several dozen in space. They are already starting to show changes in places, and this can feed into many of the SDGs – about 15 of them. They see their work as part of the Global Partnership for Sustainable Development Data – data4sdgs.org – and there are many organisations that have such a mission at their core. Collaboration and sharing are critical to making it happen.

Pierre-Yves Cousteau (Cousteau) – talked about the family legacy and the inventiveness of its efforts over the years. He highlighted the importance of ecosystem services, but there is also aesthetic value in swimming among different creatures. There are many ecosystem services – capturing carbon, producing food. The oceans produce huge economic value. There are problems with plastic distributed across the ocean – when we eat fish, we are consuming this plastic. There are also problems from climate change – risks to reefs and corals, and to the ecosystems that are based on them. There are many risks – COP21 does not discuss the ocean enough, and that is an issue. Cousteau Divers is a citizen science initiative that uses recreational divers to provide information about the marine environment. Marine protected areas help the ocean recover; this also opens an opportunity to invest in nature. Project Hermes is a project to take the temperature of the ocean properly – not enough information can be recorded from satellites. We can get the data from dive computers, which provide both historical and ongoing measurements. They have secured over 100,000 logs that will be shared openly.

Interestingly, in this first set of talks, citizen science was already recognised in three of them. "Citizen participation and citizen science is the key" was the message that closed the session.

Citizen Cyberlab Summit (day 2)

The second day of the Citizen Cyberlab Summit followed the same pattern as the first day: two half-day sessions, each with short presentations from guest speakers from outside the project consortium, followed by two demonstrations of a specific platform, tool, pilot or learning activity, and ending with discussions in groups, which were then shared back.

The first session started with the history of citizen sciences – Bruno Strasser (Uni Geneva) – looking at both practical citizen science and the way it is integrated into the history of science. The Bioscope is a place in Geneva that allows different public-facing activities in the medical and life sciences: biodiversity, genetic research etc. They are developing new ways of doing microscopy – a microscope which shares its imagery with the whole room, so it is seen on everyone's devices, turning the microscope from a solitary experience into a shared one. They are involved in biodiversity research aimed at DNA bar-coding of different insects and animals: people collect data, extract DNA and sequence it, and then share it in a national database. Another device that they are using is a simple add-on that turns a smartphone into a powerful macro camera, so children can share images on Instagram with a Bioscope hashtag. They also run a 'Sushi night' where they tell people what fish they actually ate, if at all…
This links to a European Research Council (ERC) project – the Rise of the Citizen Sciences – on the history of the movement. Is there something like 'citizen sciences'? From a history-of-science perspective, in the early 20th century the amateur scientist was passing and professionals were replacing them. He uses a definition of citizen science as amateurs producing scientific knowledge – he is not interested in doing science without the production of knowledge. He noted that there are a lot of names used in citizen science research. In particular, the project focuses on the experimental sciences – because of the laboratory revolution of the 1930s, which dominated the 20th century. Lab science created the divide between the sciences and the public (Frankenstein is a relevant pivotal image here). Science popularisation tried to bridge the gap to the public, but the rise of the experimental sciences was coupled with a decline in public participation. His classification looks at everything from DIYbio to volunteer computing – identifying observers, analysers etc. and how they become authors of scientific papers. Citizen science is taken up by the shift in science policy towards science with and for society. It is interesting because of the promises attached to it: scientific, educational (learning more about science) and political (more democratic), and because it is an answer to 'big data', to the contract between science and society, to expertise, participation and democratisation. The difference is demonstrated by the French response following Chernobyl in 1986, with a presentation by a leading scientist in France claiming that the particles would stop at the French border, compared with Deepwater Horizon in 2010, where participatory mapping through Public Lab activities 'tells a different story'. The project has four core research questions. First, how does citizen science transform the relationship between science and society? Second, who are the participants in the 'citizen sciences'? We have some demographic data, but no big picture – a collective biography of the people who are involved in it. Third, what are the 'moral economies' that sustain the citizen sciences – the give and take, what people get out of a project and what they want, motivations and rewards? Finally, how do the citizen sciences impact the production of knowledge – what is possible and what is not? He plans to use approaches from the digital humanities, building a database about the area of citizen science and looking at Europe, the US and Asia, and is considering how to run it as a participatory project. Issues of moral economies are demonstrated in the use of BOINC in commercial projects.

Lifelong learning & DIY AFM – En-Te Hwu (Edwin, Academia Sinica, Taiwan). There are different ways of doing microscopy at different scales – for the past 100 years we have had the concept of 'seeing is believing', but what about things that we can't see with a light microscope – e.g. under 1 micron? This is possible with a scanning electron microscope, which costs 500K to 2M USD and can only be used with conductive samples, which requires manipulation of the sample. The Atomic Force Microscope (AFM) is more affordable, at 50K to 500K USD, but still out of reach for many. It can be used to examine nanofeatures – e.g. carbon nanotubes – and the more advanced systems are starting to provide higher temporal and spatial resolution. Since 2013, the LEGO2NANO project has worked on using a DVD head to monitor the probe, and other off-the-shelf parts, to make the AFM affordable. They put up an Instructables prototype that was mentioned by the press, and they called it the DIY AFM. They created an augmented reality tool to guide people in putting the device together, and it can be assembled by early high-school students – moving from the clean room to the classroom. The tool is being used to look at leaves, CDs and more – an area of 8×8 microns. The AFM data can be used with 3D printing – they ran a summer school in 2015 and now have a link to the LEGO Foundation. They are going through a process of reinventing the DIY AFM because of patenting and intellectual property rights (IPR) – there is a need to rethink how to do it. They have started to rethink the scanner, the controller and other parts, and they share the development process (using a building-process platform from the MIT Media Lab). There is a specific application of the AFM for measuring air pollution at PM2.5: removing the protection layer of a DVD, exposing it for a period of time, and then bringing it back and measuring the results. They fed the measurements into Crowdcrafting for analysis.
The concept behind the AFM is demonstrated using LEGO parts, scanning the studs of a LEGO brick, so students can understand the process.

The morning session included two demonstrations. First, Creativity in Citizen Cyberscience – Charlene Jennett (UCLIC, UCL). Charlene is interested in the psychological aspects of HCI. Creativity is a challenge in the field of psychology, with different ideas of what it is – one view is that it's about the eureka moment, as demonstrated in the Foldit breakthrough. An alternative is to notice everyday creativity: doing things differently, or in ways not originally thought of. In Cyberlab, we are looking at different projects that use technologies in different contexts. In the first year, the team ran interviews with BOINC, EyeWire, Transcribe Bentham, Bat Detective, Zooniverse and Mapping for Change – a wide range of citizen science projects. They found many examples – volunteers drawing pictures of the ships that they were transcribing in Old Weather, or identifying the Green Peas in Galaxy Zoo, which were a new type of galaxy. Volunteers also created chatbots about their work – e.g. in EyeWire, to answer questions – as well as visualisations of information, dictionaries and further information. The findings showed a link from motivation to creativity that helps the community or the project, and led to a model connecting motivation, learning through participation, and volunteer identity, which lead to creativity. The tips for projects include: feedback on progress at the individual and project level; regular communication via forums and social media; community events – e.g. competitions in BOINC; and role management – if you can see someone is doing well, encourage them to take more responsibility. They then looked at the different pilots of Cyberlab – GeoTag-X, Virtual Atom Smasher, synthetic biology through iGEM, and Extreme Citizen Science – and interviewed 100 volunteers.
Preliminary results: in GeoTag-X, the design of the app is seen as the creative part, while for the analysts some of the harder tasks – e.g. the georeferencing of images – lead to creative solutions and the sharing of techniques. In the iGEM case they've seen people develop games and videos. In the ExCiteS cases, there is DIY activity, writing of blog posts, and participants being expressive about their own work. There are examples of people creating t-shirts, or creating maps that are appropriate for their needs. They are asking questions about other projects and how to design for creativity. It is interesting to compare the results of the project to the definition of creativity in the original call for the project. The Cyberlab project is opening up questions about creativity more than answering them.

Preliminary results from the creativity and learning survey – Laure Kloetzer (University of Geneva). One of the aims of Citizen Cyberlab was to look at different aspects of creativity, and the project gathered a lot of information from a questionnaire about learning and creativity in citizen science. The general design of the questionnaire was to capture learning outcomes. We need to remember that, out of the whole population, only a small group participates in citizen science – and within each project there is a tiny group of people who do most of the work (down to 16 in Transcribe Bentham); how people turn from the majority, who do very little work, into highly active participants is still unknown. In Citizen Cyberlab we carried out interviews with participants in citizen science projects, which led to a typology of learning outcomes – a lot wider than those usually expected or discussed in the literature – but they didn't capture what people actually learn. The hypothesis is that people who engage with the community learn more than those who don't – the final questionnaire of the project tries to quantify learning outcomes (the Informal Learning in Citizen Science – ILICS – survey). The questionnaire was tested in a partial pilot, then sent to people in volunteer computing, volunteer thinking and other types of project. They had about 700 responses, and the analysis has only started. Results: the age range of participants is diverse, from 20 to 70, but this needs to be analysed further by project. Gender: two-thirds male, a third female. 20% of people have only high-school-level education, while 40% have a master's degree or more – a large proportion have a university degree. They got people from 64 countries – the US, UK, Germany and France are the main ones (the survey was translated into French). Science is important to most, a passion for half, and integrated into their profession for 25% of participants.
Time per week: a third of people spend less than 1 hour, and 70% spend 1-5 hours – so the questionnaire captured mostly active people. The results on learning explore feelings, what people learn, how they learn, and confidence (based on the typology from previous stages of the project). Most people say that they learned something to a lot: most accept that they learned on-topic knowledge (about the domain itself – 88%), scientific skills (80%), technological skills (61%) and technical skills (58%), with political, collaboration and communication skills in about 50% of the cases. On the 'how' question, people learn most from project documentation (75%) but also from external resources (70%). Regarding social engagement, about 11% take part in the community, and for 61% it is the first time in their life that they have taken such a role. There are different roles – translation, moderating forums, and other things in the community that were not recognised in the questionnaire. 25% said that they met people online to share scientific interests – an opportunity to share and meet new people. On learning dimensions and types of learners: some people feel that they learn quite a lot about various things, while others focus on specific types of learning. Principal Component Analysis shows that learner types correlate with different forms of engagement – more time spent correlates with a specific type of learner – and there are different dimensions of learning that do not necessarily correlate. Cluster analysis shows about 10 groups: (1) people who learn a lot on-topic and about science, with increased self-confidence; (2) people who learn on-topic but without much confidence; (3) like 2, but with less perception of learning; (4) people who don't seem to learn much but prefer looking at resources; (5) people who learn somewhat, especially about computers; (6) people who learn through other means; (7) people who learn by writing, communicating and collaborating, and some science; (8) people who learn only about tools, but have a general feeling of learning; (9) people who learn on-topic but not transferably; and (10) people who learn a lot about collaboration and communication. More work is needed on this, but these are the results so far, and the raw data will be shared in December.
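To make the clustering step concrete, here is a minimal sketch of the kind of analysis described – grouping respondents, each scored on a few learning dimensions, into learner types. The data is invented, and the toy k-means below stands in for the statistical software the team actually used.

```python
# Toy clustering of survey respondents into learner types.
# Respondents and dimensions are hypothetical; k-means written from scratch.
import random

random.seed(1)
DIMENSIONS = ["on-topic", "science", "technology", "collaboration"]

# Hypothetical respondents: one 0-5 score per learning dimension.
respondents = [tuple(random.randint(0, 5) for _ in DIMENSIONS) for _ in range(60)]

def dist2(a, b):
    """Squared Euclidean distance between two score vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centre, then re-average."""
    centres = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist2(p, centres[i]))].append(p)
        centres = [
            tuple(sum(col) / len(c) for col in zip(*c)) if c else centres[i]
            for i, c in enumerate(clusters)
        ]
    return centres, clusters

centres, clusters = kmeans(respondents, k=3)
for i, c in enumerate(clusters):
    print(f"learner type {i}: {len(c)} respondents, centre {centres[i]}")
```

In the actual study the clustering was run on far more dimensions and preceded by PCA, which is why ten groups emerged rather than the three used here for brevity.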

Following the presentation, the group discussion first explored examples of creativity from a range of projects. In Crowdcrafting, when people are not active for a month, they get an email telling them that their account will be deleted – one participant created activities that link to the project, e.g. tweeting transcriptions from WWI exactly 100 years after the events happened. In the Cornell Lab of Ornithology, volunteers suggest new protocols and tasks for the project – new ways of modifying things. The games of ScienceAtHome are targeted specifically at exploring when problem solving becomes creative – using the tools and explaining to the researchers how they solve issues. In WCG, one volunteer created graphics from the API that other volunteers use and now expect to see as part of the project. There is a challenge for project coordinators in what to do with such volunteers – should they become part of the core project?
Next, there were questions about roles – giving end users enough possibilities is one option, while another is to construct modular choices, allowing people to combine them in different ways. In ScienceAtHome they have decided to put people into specific modes, consciously changing activities. There is a wide variety of participants – some want fairly passive, low-involvement participation, while others might want to do much more. Creativity can also express itself in different forms, which do not always seem linked to the project. The learning from Citizen Cyberlab is that there isn't a simple way of linking creativity to, and capturing it in, computer software; you need organisational structure and, most importantly, the awareness to look out for it and foster it to help it develop. Complementarity – e.g. bringing game people and science people together to interact – is important to creativity. Another point to consider is the degree to which people progress across citizen science projects and types of activities – the example of Rechenkraft.net shows that without the hackspace it would not have been possible to make things happen; it is volunteers plus infrastructure and support that allow creativity to happen. There are also risks in creating something that you didn't know before – ignorance. In music there isn't much risk, but in medicine or synthetic biology there can be, and we need to ask whether people stop their creativity when they see perceived risks.

The final session of the summit was dedicated to evaluation and sustainability, starting with the DEVISE project – Tina Philips (Cornell Lab of Ornithology). Tina is involved in the public engagement part of the Cornell Lab of Ornithology. She started from the 2009 Public Participation in Scientific Research (PPSR) report and the findings from the CAISE project: a scarcity of evaluations, indications that higher engagement suggested deeper learning, the need for more sensitive measures, and a lack of overall findings that apply across many projects. The DEVISE project (Developing, Validating, and Implementing Situated Evaluation Instruments) focused on evaluation in citizen science overall – identifying goals and outcomes, building professional opportunities for people in the field of informal learning, and creating a community of practice around this area. Evaluation is about improving the overall effectiveness of programmes and projects. Evaluation is different from research, as it tries to understand the strengths and weaknesses of a specific case and is less about universal rules – it's the localised learning that matters. In DEVISE, they particularly focused on individual learning outcomes. The project used a literature review and interviews with participants, project leaders and practitioners to understand their experience, and looked at a set of different theories of learning. This led to a framework for evaluating PPSR learning outcomes, which includes aspects such as interest in science and the environment, self-efficacy, motivation, knowledge of the nature of science, skills of science inquiry, and behaviour and stewardship. They also developed scales – short surveys that allow examination of specific aspects, e.g. a survey about interest in science and nature, or a survey about self-efficacy for science. There is a user guide for project evaluators with guidance on how to plan, implement and share evaluations.
There is a logic model for evaluation that includes inputs, activities, outputs, and short-term and long-term impacts; it is important to note that, of these, the short- and long-term outcomes are usually not evaluated. Tina's research looked at citizen science engagement and at understanding how participants construct a science identity. Together with Heidi Ballard, they looked at contributory, collaborative and co-created projects – including NestWatch, CoCoRaHS, and Global Community Monitor. They had 83 interviews with low, medium and high contributors, plus information from project leaders, with the data analysed using qualitative analysis methods and tools (e.g. NVivo). The interviews asked about engagement, what keeps participants involved, and memorable aspects of their research involvement. There are all sorts of extra activities that people bring up in interviews – in GCM, people say 'it completely changes the way that they respond to us and actually how much time they even give us because previously without that data, without something tangible' – powerful experiences through science. The coded interviews show that data collection, communicating with others and learning protocols are very common learning outcomes. About two-thirds of interviewees are also involved in exploring the data, but a smaller group analyse and interpret it. The majority of people came with a high interest in science, apart from the people who are focused on local environmental issues of water or air quality. Lower engagers tend to feel less connected to the project – and some crave more social outlets. The participants have a strong understanding of citizen science and their role in it. Data transparency is both a barrier and a facilitator – participants want to know what is done with their data. QA/QC is important both personally and organisationally. Participants are engaged in a wide range of activities beyond the project itself.
Group projects may have more impact than individual projects.
Following the presentation, the discussion explored the issue of data – people are concerned about how the data is used and what is done with it, even if they won't analyse it themselves. In eBird, you can get your raw data; but when looking at the people who have used the data, there is the question of the degree to which those who download it understand how to use it appropriately.

The final guest presentation was agroecology as citizen science – Peter Hanappe (Sony Computer Science Lab, Paris). Peter is interested in sustainability; in previous projects he was involved in working on accessibility for people who use wheelchairs, the development of NoiseTube, porting the ClimatePrediction BOINC framework to the PlayStation, and reducing energy consumption in volunteer computing. In his current work he looks at sustainability in food systems. Agroecology is the science of sustainable agriculture: reducing reliance on external inputs by trying to design productive ecosystems that produce food. Core issues include soil health and biodiversity, with different ways of implementing systems that will keep them productive. The standard methods of agriculture don't apply; you need to understand local conditions, and the practice of agroecology is very knowledge-intensive. Best practices are not always studied scientifically – many farms in the world are small (below 2 hectares; there are 475 million farms across the world), and more than 100 million households around the world grow food. This provides an opportunity for citizen science – each season can be seen as an experiment, engaging more people and asking them to share information so the knowledge slowly develops to provide all the needed details. Part of his aim is to develop new, free tools and instruments to facilitate the study of agroecology. This can be a basic set with information about temperature and humidity, or something more complex. The idea is to have local and remote communities that share information on a wiki to learn how to improve. Together with a group of enthusiasts that he recruited in Paris, he ran CitizenSeeds, where they tried different seeds in a systematic way – for example, with a fixed calendar of planting and of capturing information. People took images and shared information online.
The information included how much sunlight the plants get and how much humidity the soil has. On p2pfoodlab.net they can see the information in a calendar form. They had 80 participants this year. The opportunity for citizen science comes with challenges: community building; figuring out how much of it is documentation of what worked, compared to experimentation; and finding the right ways to carry out simple, relevant, reproducible experiments. Also, if there is a focus on soil health, we need multi-year experiments.
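As a sketch of how such weekly CitizenSeeds-style observations might be recorded and summarised for a shared calendar view – the field names and sample values are assumptions for illustration, not the actual p2pfoodlab data model:

```python
# Hypothetical record type and aggregation for shared growing observations.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Observation:
    week: int              # week in the shared planting calendar
    sunlight_hours: float  # assumed field: weekly sunlight received
    soil_humidity: float   # assumed field: relative soil humidity, 0-1
    photo_url: str = ""    # link to the shared image, if any

# Invented sample observations from participants.
observations = [
    Observation(1, 32.5, 0.41),
    Observation(1, 28.0, 0.55),
    Observation(2, 40.0, 0.30),
]

def weekly_summary(obs):
    """Average sunlight and soil humidity per calendar week."""
    by_week = {}
    for o in obs:
        by_week.setdefault(o.week, []).append(o)
    return {
        week: (mean(o.sunlight_hours for o in group),
               mean(o.soil_humidity for o in group))
        for week, group in sorted(by_week.items())
    }

print(weekly_summary(observations))
```

A fixed planting calendar is what makes this aggregation meaningful: because everyone plants in the same week, averages across participants compare like with like.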

I opened the last two demonstrations of the session with a description of the Extreme Citizen Science pilots. Starting similarly to the first presentation of the day, it is useful to notice the three major periods in science with regard to public participation. First, the early period of science, when you needed to be wealthy to participate – although there are examples like Mary Anning, who, for reasons of gender, religion and class, was not accepted as an equal by the emerging scientific establishment; it is justified to describe her as a citizen scientist, albeit in a full-time capacity. However, she is the exception that points to the rule. More generally, not only was science understood by few, but the general population had very limited literacy, so it was difficult to engage them in joint projects. During the period of professional science, there is a whole host of examples of volunteer data collection – from phenology to meteorology and more. As science became more professional, the role of volunteers diminished, and scientists looked to automatic sensors as a more reliable means of collecting information. At the same time, until the late 20th century, most of the population had limited education – mostly up to high school – so the tasks that volunteers were asked to perform were limited to data collection. In the last ten years there are many more people with higher education – especially in industrialised societies – and that is part of the opening up of citizen science that we see now: they can participate much more deeply in projects.
Yet, with all these advances, citizen science is still mostly about data collection and basic analysis, and it is also targeted at the more highly educated parts of the population. Therefore, Extreme Citizen Science is about the extremities of citizen science practice – engaging people in the whole scientific process, allowing them to shape data collection protocols, collect and analyse the data, and use it in ways that suit their goals. It is also important to engage people at all levels of literacy, and to extend the practice geographically across the world.
The Extreme Citizen Science (ExCiteS) group is developing methodologies aimed at facilitating this vision. Tools like GeoKey, which is part of the Citizen Cyberlab project, facilitate community control over the data and over decisions about what information is shared and with whom. Community Maps, which are based on GeoKey, are a way to allow community data collection and visualisation; there is also a link to EpiCollect, so mobile data collection is possible, with GeoKey managing the information.
These tools can be used for community air quality monitoring, using affordable and accessible methods (diffusion tubes and borrowed black carbon monitors), but they also show the potential of creating a system that is suitable for people with low levels of literacy. Another pilot project carried out in Citizen Cyberlab included playshops and the exploration of scientific concepts through engagement and play. This also includes techniques from Public Lab, such as kite and balloon mapping, with the potential of linking the outputs to Community Maps through GeoKey.

Finally, CCL Tracker was presented by Jose Luis Fernandez-Marquez (CERN). The motivation for creating CCL Tracker is the need to understand more about participants in citizen cyberscience projects and what they learn. Usual web analytics provide information about who is visiting the site and how they arrived, but tools like Google Analytics do not measure what people actually do on websites. We want to understand the 20% of users doing 80% of the work in citizen cyberscience projects, and that requires much more information. Using an example of Google Analytics data from a volunteer computing project, we can see about 16,000 sessions and 8,000 users from 108 countries, with 400 sessions per day. We can see that most are male, we can tell by which route they arrived at the website, etc. CCL Tracker helps to understand the actions performed on the site and to measure participants' contributions. There is a need to make the analytics data public and to create advanced data aggregation – clustering it so that it does not disclose unwanted details about participants. The CCL Tracker library works together with Google Tag Manager and Google Analytics, and Google Super Proxy is used to share the information.
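Concretely, understanding the 20% of users who do 80% of the work means aggregating action-level events per user rather than counting page views. A minimal sketch of that kind of aggregation, assuming a hypothetical event log (the field names are illustrative, not CCL Tracker's actual schema):

```python
# Hypothetical action-level event log, as a tracker library might report
# alongside standard page analytics. Not the real CCL Tracker data format.
from collections import Counter

events = [
    {"user": "u1", "action": "classify"}, {"user": "u1", "action": "classify"},
    {"user": "u1", "action": "comment"},  {"user": "u2", "action": "classify"},
    {"user": "u1", "action": "classify"}, {"user": "u3", "action": "classify"},
]

def contribution_share(events, top_fraction=0.2):
    """Share of all tracked actions performed by the most active users."""
    per_user = Counter(e["user"] for e in events)
    n_top = max(1, round(len(per_user) * top_fraction))
    top = sum(count for _, count in per_user.most_common(n_top))
    return top / sum(per_user.values())

# Here the single busiest user (u1) performed 4 of the 6 actions.
print(contribution_share(events))
```

Publishing only aggregates like this (shares, clusters of behaviour) is one way to make analytics data public without disclosing details about individual participants.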

Citizen Cyberlab Summit (day 1)

The Citizen Cyberlab Summit is the final event of the Citizen Cyberlab project. The name might sound grand, but the event itself was fairly intimate and focused, with about 40 participants from across the world. The aim of the event was to share the lessons from the project and compare them with similar activities around the world. It also provided an opportunity to consider, with experts from different areas, the directions in which the project partners should progress beyond the specific ‘deliverables’ (outcomes and outputs) of the project. The meeting was held in the Confucius Institute of the University of Geneva, which has a mission to improve scientific diplomacy and international links between researchers, so it was a suitable venue for such an international scientific meeting.

An introduction to Citizen Cyberlab was provided by Ariel Lindner (UPD), the main project leader. He noted that the starting point of Citizen Cyberlab is that we know people learn better by doing, and that working with the public is also beneficial for scientists – both for becoming aware of public concerns, and because of the moral obligation to share the results of research with those who fund it. The Citizen Cyberlab project, which is in its final months, was based on three parts: platforms, pilots, and tools. The platforms are aimed at lowering the barriers to participation for scientists and citizens (computation and participation platforms). The platforms are tested through pilot projects, which are then evaluated for creativity and learning – exploring learning behaviour, creativity and community engagement. We aim to share the successful experiences, but also the challenges that emerged through the various activities. Among the computation platforms, CitizenGrid is aimed at allowing cloud-based projects to run; RedWire is a new way to consider game design – an open source game engine with open game analytics (the ability to measure what people do with the games), which was used in the development of science games; GeoKey is the final platform, and it allows people to share their concerns and control information. The project pilots included Virtual Atom Smasher, which is about learning particle physics and helping scientists; GeoTag-X at UNITAR, helping in disaster response; and SynBio4All, which opens up synthetic biology to a wider audience – with games such as Hero Coli and a MOOC on DIY synthetic biology (through iGEM), including activities around ‘the smell of us’, about the odour that people emit and identifying the bacteria that influence it. L’Oréal is interested in developing this research further. There are several Extreme Citizen Science pilots, too.
The tools developed in the project included creativity tools such as IdeaWeave.io to explore and develop ideas, CCL Tracker for monitoring learning, and the EpiCollect+ system to allow data collection for a wide range of projects.
Understanding creativity and understanding what people learn are both complex tasks – understanding the learning also had to be done with other communities in citizen science. Finally, there is a specific effort on community engagement through social media and media outlets (YouTube and audio).

The rest of the event was structured as follows: two short presentations from guest speakers outside the project consortium were followed by two demonstrations of a specific platform, tool, pilot or piece of learning, and each session ended with discussions in groups, which were then shared back. In all, the summit had four such sessions.

Following this introduction, two guests gave short talks, the first about World Community Grid (WCG) from Juan Hindo (IBM). Juan provided details of WCG, which is part of IBM's corporate citizenship group. WCG is a philanthropic programme that supports participation in science through distributed computing, allowing scientists to access large-scale computing by using unused processing capacity on volunteers' computers and mobile devices. The projects can be ‘the biggest and most fundamentally important activities in labs’, according to researchers who participate in the programme. Examples of success include new solar materials from Harvard University researchers, with thousands of candidate materials. Other breakthroughs happened in childhood cancer research and in Computing for Clean Water, led by Tsinghua University in China, which explored the use of nano-tubes for water filtration. WCG promotes Open Science – asking researchers to make their data publicly available – and focuses on humanitarian, real, tangible science, with IBM support. Using IBM's corporate reach, they get lots of attention in the media. They try to engage volunteers as much as possible, and carried out an extensive volunteer study two years ago. Demographically, volunteers are mostly men with a technical background, aged 20-40, who usually volunteer for 5 years; people join because they want to help science, and learning about the science is a reason to stay. People want to understand the impact of the computations that they perform – beyond just statistics – and ask for information to be understandable. WCG are now trying to build a more diverse volunteer base, more approachable scientific content, and a clearer articulation of the value of contribution. They see an opportunity to reach out to young people and women; they try to engage people through the story of the science, reassure them that the process is safe, and design the experience to take a short time.
They also want to leverage existing volunteers – they set up a recruitment competition for existing volunteers, but that led to very few new people joining. They also use social media on Twitter, YouTube and Facebook. There is growing engagement with social media, but not enough conversion to volunteering. They also work on layering information with researchers, asking for consistent and regular updates on the research, and give volunteers control over the communications that they receive. Articulating the value of contributions means highlighting research stories – not just computations and numbers of volunteers – and celebrating and promoting scientific success; they lean on networks within IBM to spread the word. The campaign helped to double the registration rate to the system. They want to reach more volunteers and they follow conversion rates – they are still missing stories from volunteers and a volunteer voice, and need to remove barriers to entry, which the recruitment drive didn’t achieve. They also want to expand the research portfolio into other areas that the platform can support.

In the discussion that followed, the importance of IP, and of treating volunteers as individuals, came up as topics worth exploring with volunteer computing projects.

The next presentation was Science@home, by Jacob Sherson (University of Aarhus, Denmark). Jacob noted that in citizen science there are different difficulty levels and opportunities for user innovation. In Science@home they are trying to extend the range of citizen science involvement with students. They are engaged in creativity research – trying to evaluate creativity within a positivist empirical framework, controlling different variables and evaluating the creativity of the output accordingly. They run scienceathome.org, with 3,000 people participating in projects and experiments ranging from cognitive science to quantum physics and business administration – and they have an interdisciplinary team from different areas of research to support the development of the system. An example of the type of project that they deal with is quantum computing – manipulations of electrons, which slosh around between states when moved with laser beams. Using analogies to the high school curriculum was a useful way to engage participants and make it relevant to their studies. They have discovered that students can understand quantum physics in a phenomenological way through a game interface, and that gamers find good regions for solutions: the players localised areas of the big parameter space faster than computer simulation. They are also studying the formation of strategies in people's minds with Quantum Minds. With this programme, they study the process of learning the task and mastering it. They looked at the way people learn how to solve problems – to see if early performance helps to predict the ability to learn the topic. Other games include trying to understand innovation in the Alien Game. They also have a behavioural economics game about the formation of groups. The educational part is about creativity – thinking of motivations for curriculum and fun with different resources.
Game-based education is assumed to improve the curriculum and can increase the motivation to learn. The general approach is to provide personalised online learning trajectories – identify types of students and learners, correlate them, and create a personalised learning experience. They also want to train researchers to help them explore.

The next part of the morning session was the two demonstrations, starting with EpiCollect – David Aanensen (Imperial College). EpiCollect was created to deal with infectious disease – who, what, where and when – getting information about the genetic make-up of diseases. They realised that there is a generic issue of metadata gathering, and the tool evolved into a generic form collection and visualisation tool. The current use of EpiCollect includes a lot of veterinary projects, as GPS monitoring of animals is easier in terms of ethics. It was also used by the Food and Agriculture Organization (FAO) to monitor the provision of food to communities in different parts of the world. It is also used in education projects at Bath University in field courses (building on the Evolution MegaLab project to collect information about snails), with students building questionnaires based on the information sheets of the project. They are starting to build longitudinal data. There are projects that link EpiCollect to other systems – such as GeoKey, and CartoDB for visualisation.

Red Wire was presented by Jesse Himmelstein (University Paris Descartes). Red Wire is a platform aimed at reducing the barrier to creating games for citizen science through a mash-up approach – code and games are open access to encourage reuse. It uses a functional programming approach in a visual programming environment, taking metaphors from electronics. There are examples of games that students developed during recent summer schools and other activities.

CitizenGrid was discussed by John Darlington (Imperial College, London). CitizenGrid is a platform that enables replicating projects on cloud computing, specifically for volunteer computing projects. It can provide unified support for volunteer computing – support for the scientists who are setting up a project, but also for the volunteers who want to link to the project. The scientists can map their resources through the creation of both client and server virtual machines, and register the application. They demonstrated it with projects that also use games – allowing the application to be installed on local machines or on cloud computing.

In the breakout groups, participants discussed the complexity of the platforms and the next steps to make them more accessible. For EpiCollect, there are challenges in identifying who the users are – they are both the coordinators and the data collectors – and helping them set up useful projects is challenging, especially given the need for usability and user experience expertise. Dealing with usability and user experience is a challenge common to all such projects. For RedWire, there is a need to help people who do not have any programming experience – scientists and teachers – to develop games; maybe even gamify the game engine itself, with credits to successful game designers who create components that can be remixed. For CitizenGrid, there is a need for examples of use cases, with Virtual Atom Smasher currently the main demonstrator.

The afternoon session explored pilot projects. CERN@School – Becky Parker (Langton Star Centre) described how she developed, with her students and in collaboration with scientists, the ability to do science at school. The project is a demonstration of how students and teachers can become part of the science community. It started years ago with students contributing to astrophysics research. The school is involved in fundamental research, with a 17-year-old student publishing a scientific paper based on a theoretical physics research problem presented to the students by professional scientists. Her students also put together an instrument to detect cosmic rays on the satellite TDS-1. They can see where their experiment is through a visualisation over Google Maps that the students developed themselves. Students also created analysis tools for the data, and can contribute to NASA research on the impact of cosmic rays on International Space Station staff. CERN@School also includes an experiment in collecting radiation readings, which helps to map background radiation in the UK (done by students aged 14-15). Through their work, they discovered that there aren’t many radiation readings in the ocean, and they will address that by mounting a radiation sensor on a sea UAV. All this helps students learn to be scientists. They created the Monopole Quest project within the Zooniverse. It is possible to get young people involved in large-scale science projects. It also helps to encourage science teachers and to ensure job satisfaction for teachers. The involvement of girls in the project also leads to more participation in science and engineering after school, with the school having a disproportionate share of the young women who go on to study such topics in the UK.

Rechenkraft.net – From Volunteers to Scientists – Michael Weber (Uni Marburg). Michael described how volunteers turned into scientists in the area of volunteer computing. Rechenkraft started in 2005 with a forum dedicated to all the distributed computing projects around the world, sharing information about them among German-speaking volunteers. Projects are now being translated into other languages, too. This led to the creation of an organisation which is now involved in many projects, including climateprediction.net. Volunteers also created monitoring programmes that indicate progress and provide statistics about contributions. They also have a yearly face-to-face gathering of volunteers from across Germany and beyond, which has resulted in them building their own data processing racks and other initiatives. They started in an electronic sports league, but then realised that there are opportunities to assist scientists in developing new projects, and that led to Yoyo@home, which allows the community to help scientists develop BOINC projects. They regularly participate in conferences and exhibitions to promote the opportunity to other people interested in technology, and they became part of the Quake-Catcher Network. They received significant press coverage – eventually the city of Marburg (Germany) offered the organisation a physical space that became the city's hackspace. Once there was a steady place, they created more sophisticated cluster computers. They also set up the WLAN in the local refugee camp. Finally, they developed their own scientific project – RNA World, which is a completely internal project. They encountered problems with very large output files from simulations, so they are learning about running distributed computing projects as scientists who use the results, and not only as volunteers. They are also starting to run different projects, for example on tree health, with data recording covering location, photos and plant material.
Similarly, they map protected flowers – all this on a volunteer basis. They participate in the effort of developing a citizen science strategy 2020 for Germany, and they would like funding to be available to the average person so they can participate in projects. There is a risk that citizen science will be co-opted by scientists – there is a need to leave space for grass-roots initiatives. There are also barriers to publication. The need for lab results in addition to the simulations encouraged the creation of a wet lab.

The last short guest talk came from Bernard Revaz, who suggested creating Massively Multiplayer Online Science – using game environments like WoW (World of Warcraft) to do science. His aim is to inject science into projects such as EVE Online – at any given time there are 40,000 users, with a median age of 35 and 50% holding a degree in science. In EVE Online they designed a task from the Human Protein Atlas in which gamers help to classify images. The stakeholders in their discussions include scientists, the gaming company and players, and all are very positive about the prospect. In EVE Online there are many communities – they are creating a new community of scientists that people join voluntarily. They are working on matching the science tasks to the game narrative and to the game reward system.

After these two guest talks, there were two Demos. 

First, Virtual Atom Smasher (VAS) – Ioannis Charalampidis (CERN). The VAS is about the way CERN develops the science cycle: observe the situation, which leads to theory by theoretical physicists, and then carry out experiments to test them. The process includes computer simulations that are explored against experimental data, adjusting the models until they reflect the results. VAS evolved from a project by a 15-year-old student in 2010, who managed to create the best-fitting results of a simulation. The VAS involves real cutting-edge science, but it is also very challenging, so they created a game (but don't use the word game – it's a simulation). The VAS uses CitizenGrid and RedWire for the game, and CCL Tracker to understand the way people use the platform. The analytics show the impact of training on the desired flow of the game. The VAS combines exploration with opportunities for learning.

Geotag-X – Eleanor Rusack (UNITAR). This is a platform to crowdsource the analysis of images in humanitarian crises. They usually use satellite imagery to deal with crises, but some images have limitations – roofs, clouds, etc. – and there is a need to know what is going on on the ground. The idea is to harvest photos coming from a disaster, then analyse them and share the knowledge. A lot of the information in photos can be very useful – it's possible to extract structural information and other details from the image. They have a workflow: experts set up projects, then develop the structure of the processing, tutorials, and tools for photo collection (from Flickr, Twitter, EpiCollect and a Chrome extension). The photos are added to the analysis pool. They have created a project to allow people to deal with Yemeni cultural heritage at risk as a result of the war that is happening there. The system is mostly based on self-learning. Geotagging photos is a challenging task, and it is especially an area that needs more work. The experts are professionals or academics in a specific domain who can help design the process, while participants come from different backgrounds. They recruit people through SciStarter, Mozilla Science, etc. They keep in touch with online volunteer groups – people who come from SciStarter tend to stay. Digital volunteers also help a lot, and they encourage volunteering through presentations, but most important are data sprints. They evaluate the agreement between analysts on each question, looking both at the level of agreement and at its spread (standard deviation) across images. They identify three groups of questions: easy (high agreement, low standard deviation), mid (median agreement, high standard deviation) and complex (low agreement, low standard deviation). Analysing images against these agreement levels helps to improve designs. They want to move questions up the curve, and ask: how do you train a large number of analysts when project leaders have limited time?
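The agreement analysis can be sketched roughly as follows; the thresholds, data structures and example answers here are invented for illustration, not GeoTag-X's actual implementation:

```python
# Sketch: bucket a question as easy / mid / complex from analysts' answers.
# Thresholds (0.8, 0.5, 0.15) are illustrative, not GeoTag-X's real values.
from collections import Counter
from statistics import stdev

def agreement(answers):
    """Fraction of analysts who gave the modal answer for one image."""
    return Counter(answers).most_common(1)[0][1] / len(answers)

def classify_question(per_image_answers, high=0.8, low=0.5, tight=0.15):
    """Bucket a question by mean agreement and its spread across images."""
    scores = [agreement(a) for a in per_image_answers]
    mean = sum(scores) / len(scores)
    spread = stdev(scores)
    if mean >= high and spread <= tight:
        return "easy"      # analysts almost always agree
    if mean <= low and spread <= tight:
        return "complex"   # consistently hard to agree on
    return "mid"           # agreement swings between images

# Invented example: 10 analysts per image, answering one question twice.
print(classify_question([["damaged"] * 9 + ["intact"],
                         ["damaged"] * 8 + ["intact"] * 2]))  # easy
```

Bucketing questions this way points designers at the 'mid' and 'complex' ones – the natural candidates for clearer wording or better tutorials.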

The follow-up discussion explored improvements to VAS – such as integrating arts, or linking a BOINC project that would contribute computing resources to the VAS. For GeoTag-X, the discussion explored the issue of training – with ideas about involving volunteers in getting the training right, running virtual focus groups, or exploring design aspects and collaborations between volunteers.

Building Centre – from Mapping to Making

The London-based Building Centre organised an evening event – From Mapping to Making – which looked at how the “radical evolution in the making and meaning of maps is influencing creative output. New approaches to data capture and integration – from drones to crowd-sourcing – suggest maps are changing their impact on our working life, particularly in design.” The event included five speakers (including me, on behalf of Mapping for Change) and a short discussion.

Lewis Blackwell of the Building Centre opened the evening by noting that, in a dedicated exhibition on visualisation and the city, the Building Centre is looking at new visualisation techniques. He realised that a lot of the visualisations are connected to mapping – it's circular: mapping can ask and answer questions about the design process of the built environment, and changes in the built environment create new data. The set of talks in the evening explored the role of mapping.

Rollo Home, Geospatial Product Development Manager, Ordnance Survey (OS), started by describing the OS as ‘the oldest data company in the world’. The OS thinks of itself as a data company – the traditional mapping products that are so familiar represent only 5% of its turnover. The history of the OS goes back to 1746 and William Roy's work on accurately mapping Britain. The first maps were produced in Kent, for the purpose of positioning ordnance. The maps of today, when visualised, look somewhat the same as maps from 1800, but the current maps are in machine-readable formats, which means that the underlying information is very different. Demands for mapping have changed over the years: originally for ordnance, then for land information and taxation, and later helping the development of the railways. During WWI and WWII the OS led many technological innovations – from the national grid in the 1930s to photogrammetry. In 1973 the first digital maps were produced, and the process was completed in the 1980s. This was, in terms of data structures, still structured as a map. Only in 2000 did MasterMap appear, with a more machine-readable format that is updated 10,000 times a day, based on an Oracle database (the biggest spatial database in the world) – but it's not a map. Real-world information is modelled to allow for structure and meaning. The ability to answer questions from the database is critical to decision-making. Many parts of the information can be made explicit – from the area of rear gardens to the height of a building. They see developments in the areas of oblique image capture, 3D data, details under the roof, and facades, and they do a lot of research to develop their future directions – e.g. the challenges of capturing data in point clouds. They see data coming from different sources, including social media, satellites, UAVs, and official sources. Most Smart Cities/Transport and similar areas need geospatial information, and the OS is moving from mapping to data, enabling better decisions.
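To illustrate the difference between a drawn map and modelled data: once a feature is stored as coordinates rather than ink, a question like 'what is the area of this rear garden?' becomes a computation. A minimal sketch with an invented polygon (not OS data or any OS API):

```python
def polygon_area(vertices):
    """Shoelace formula: area of a simple polygon from (x, y) vertices in order."""
    n = len(vertices)
    twice_area = sum(
        vertices[i][0] * vertices[(i + 1) % n][1]
        - vertices[(i + 1) % n][0] * vertices[i][1]
        for i in range(n)
    )
    return abs(twice_area) / 2

# A hypothetical rear-garden footprint in metres (local coordinates):
garden = [(0, 0), (8, 0), (8, 12), (0, 12)]
print(polygon_area(garden))  # 96.0 square metres
```

Real topographic data would of course carry projections, identifiers and attributes; the point is only that a modelled feature supports queries that a drawn map cannot.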

Rita Lambert, Development Planning Unit, UCL, covered the ReMap Lima project – running since 2012 and looking at marginalised neighbourhoods in the city. The project focused on the questions of what we are mapping and what we are making through representations. Maps contain the potential of what might become – we are making maps and models that are about ideas, and possibilities for more just cities. The project is a collaboration between the DPU and CASA at UCL, with 3 NGOs in Lima and 40 participants from the city. They wanted to explore the political agency of mapping, open up spaces to negotiate outcomes, and expand the possibilities of spatial analysis in marginalised areas through a participatory action-learning approach. The use of technology is in the context of very specific theoretical aims; the use of UAVs is deliberate, to explore their progressive potential. They mapped the historic centre, which is overmapped and marginalised through over-representation (e.g. using maps to show that it needs regeneration), while the periphery is undermapped – a large part of the city (50% of its area) marginalised through omission. Maps can act through undermapping or overmapping. The issues are very different – evictions, lack of services, and loss of cultural heritage (people and buildings) at the centre, while in the informal settlements there are risks, land trafficking, destruction of ecological infrastructure, and a lack of coordination of spatial planning between places. The process that they followed included mapping from the sky (with a drone) and mapping from the ground (through participatory mapping using the aerial images). The drones provided imagery in an area that changes rapidly – and the outputs were used in participatory mapping, with the people on the ground deciding what to map and where to map. The results make it possible to identify evictions through changes to buildings that can be observed from above.
The mapping process itself was also a means of strengthening community organisations. The use of 3D visualisation at the centre and at the periphery helped in understanding the risks that are emerging and the changes to the area. Data collection uses both maps and tools such as EpiCollect+ and community mapping, along with 3D-printed models that can be used in discussions and conversations. The work carries on as the local residents continue it. The conclusion: careful consideration of the use of technology in context, with mapping from the sky and the ground going hand in hand. The new representations being created are significant, as is what it is that we are producing. More information at remaplima.blogspot.co.uk and learninglima.net.

Simon Mabey, Digital Services Lead for City Modelling, Arup. Simon discussed city modelling at Arup – the move from visualisation to more sophisticated models. He has led on modelling cities in 3D since 1988, when visualisation of future designs was done by stitching together pieces of paper and photos. The rebuilding of Manchester in the mid 1990s led to the development of 3D urban modelling, with animations, and to the creation of an interactive CD-ROM. They continued to develop the data about Manchester and then shared it with others. The models were used in different ways – from gaming software to online – trying to find ways to allow people to use them in real-world contexts. Many models are used in interactive displays – e.g. for attracting inward investment. They went on to model many cities across the UK, with different levels of detail and coverage. They are also starting to identify features underground – utilities and such. Models are kept up to date through collaboration, with clients providing information about the things that they are designing, and through integrating BIM data. In Sheffield, they also enhance the model through the planning of new projects and activities. Models are used to communicate information to other stakeholders – e.g. traffic model outputs, and the same with pedestrian movement. They use different information to colour-code the model (e.g. energy), or for acoustic modelling or flooding. More recently, they have moved to city analytics, understanding the structures within models – for example, understanding solar energy potential alongside the use and consumption of a building. They find themselves needing information about what utility data exists, which then needs to be mapped and integrated into their analysis. They are also getting mobile phone data to predict the trips that people make.

I was the next speaker, on behalf of Mapping for Change. I provided the background of Mapping for Change and the approach that we are using for mapping. In the context of the other talks, which focused on technology, I emphasised that just as we are trying to reach out to people in the places that they use daily and fit the participatory process into their life rhythms, we need to do the same in the online environment. That means that conversations need to go where people are – so linking to Facebook, Twitter or WhatsApp. We should also recognise that people use different ways to access information – some will use just their phone, others laptops, and for others we need to think of a laptop/desktop environment. In a way, this complicates participatory mapping much more than earlier participatory web mapping systems, when participants were more used to the idea of using multiple websites for different purposes. I also mentioned the need to listen to the people that we work with, and to decide whether information should be shown online or not – taking into account what they would like to do with the data. I mentioned work that involves citizen science (e.g. air quality monitoring), but more generally the ability to collect facts and evidence to deal with a specific issue. Finally, I also showed some examples of our new community mapping system, which is based on GeoKey.

The final talk was from Neil Clark, Founder, EYELEVEL. His architectural visualisation company works in the North East and operates in the built environment area. They use architectural modelling with Ordnance Survey data, positioning the designs so that they can be rendered accurately. Many of the processes are very expensive and complex. They have developed a tool called EYEVIEW for accurate augmented reality – working on an iPad to allow viewing models in real time. This can cut the costs of producing these models. They use a tripod to make it easier to control. The tool is the outcome of 4 years of development, and allows navigating the architectural model so that it overlays the camera image. They are aiming at Accurate Visual Representation, and they follow the detailed framework that is used in London for this purpose: www.eyeviewportal.com

The discussion that followed explored the political nature of information, and who is represented and how. A question to the OS was how open it will be with the detailed data; Rollo explained that access to the data is a complicated issue, and it needs to be funded. I found myself defending the justification for charging for highly detailed models, by suggesting that we imagine a situation in which the universal provision of high-quality data at the national level wasn’t there, and you had to deal with each city's data model separately.

The last discussion point was about truth in mapping, and the positions that were presented: is it about the way that people understand their own truth, or is there an absolute truth that is captured in models and maps – or represented in 3D visualisations? Interestingly, three of the talks assumed that there is a way to capture specific aspects of reality (structures, roads, pollution) and model them with numbers, while Rita and I took a more interpretive and culturally led approach to representation.