The Global Science Forum – National Experts on Science and Technology Indicators (GSF-NESTI) Workshop on “Reconciling Scientific Excellence and Open Science” (for which you can see the full report here) asked the question “What do we want out of science and how can we incentivise and monitor these outputs?” In particular, the objective of the workshop was “to explore what we want out of public investment in science in the new era of Open Science and what might be done from a policy perspective to incentivise the production of desired outputs”, with the aim of exploring the overarching questions of:
1. What are the desirable (shorter-term) outputs and (longer-term) impacts that we expect from Open Science and what are potential downsides?
2. How can scientists and institutions be incentivised to produce these desirable outcomes and manage the downsides?
3. What are the implications for science monitoring and assessment mechanisms?”
The session that I was asked to contribute to focused on Societal Engagement: “The third pillar of Open Science is societal engagement. Ensuring open access to scientific information and data, as considered in the previous sessions, is one way of enabling societal engagement in science. Greater access to the outputs of public research for firms is expected to promote innovation. However, engaging with civil society more broadly to co-design and co-produce research, which is seen as essential to addressing many societal challenges, will almost certainly require more pro-active approaches.
Incentivising and measuring science’s engagement with society is a complex area that ranges across the different stages of the scientific process, from co-design of science agendas and citizen science through to education and outreach. There are many different ways in which scientists and scientific institutions engage with different societal actors to inform decision-making and policy development at multiple scales. Assessing the impact of such engagement is difficult and is highly context- and time-dependent”.
For this session, the key questions were:
- “What do we desire in terms of short and long-term outputs and impacts from societal engagement?
- How can various aspects of scientific engagement be incentivised and monitored?
- What are the necessary skills and competencies for ‘citizen scientists’ and how can they be developed and rewarded?
- How does open science contribute to accountability and trust?
- Can altmetrics help in assessing societal engagement?”
In my talk, I decided to address the first three questions by reflecting on my personal experience (the story of a researcher trying to balance “excellence” and “societal engagement”), then considering the experience of participants in citizen science projects, and finally the institutional perspective.
I started my presentation [Slide 3] with my early experiences in public engagement with environmental information (and participants’ interest in creating environmental information) during my PhD research, 20 years ago. This was the piece of research that set me on the path of societal engagement and open science – for example, the data that we were showing was not accessible to the general public at the time, and I was investigating how the processes that follow the Aarhus Convention, together with the use of digital mapping information in GIS, can increase public engagement in decision-making. This research received a small amount of funding from UCL, and later from the ESRC, but nothing significant.
I then secured an academic position in 2001, and it took until 2006 [Slide 4] to develop new systems – for example, this London Green Map was developed shortly after the Google Maps API became available, and while it was one of the first participatory GIS applications on top of this novel API, it was inherently unfunded (and was done as an MSc project). Most of my funded work at this early stage of my career had no link to participatory mapping and citizen science. This was also true for the research into OpenStreetMap [Slide 5], which started around 2005 and, apart from a small grant from the Royal Geographical Society, was not part of the main funding that I secured during the period.
The first significant funding specifically for my work came in 2007-8, about 6 years into my academic career [Slide 6]. Importantly, it came because the people who organised a bid for the Higher Education Innovation Fund (HEIF) realised that they were weak in the area of community engagement, and the work that I was doing in participatory mapping fitted into their plans. This became a pattern, where people approached me with a “community engagement problem” – a signal that awareness of societal engagement was starting to grow, though in terms of budget and place in the projects, it remained at the edge of the planning process. By 2009, the investment had led to the development of a community mapping system [Slide 7] and the creation of Mapping for Change, a social enterprise dedicated to this area.
Fast forward to today [Slides 8-10], and I’m involved in creating software for participatory mapping with non-literate participants, which supports the concept of extreme citizen science. In terms of “scientific excellence”, this development towards a mapping system that anyone, regardless of literacy, can use [Slide 11] is funded as “challenging engineering” by the EPSRC, and as “frontier research” by the ERC – showing that it is possible to fully integrate scientific excellence and societal engagement, and answering the “reconciling” issue in the workshop. A prototype is being used with ZSL to monitor illegal poaching in Cameroon [Slide 12], demonstrating the potential impact of such research.
It is important to demonstrate the challenges of developing societal impact by looking at the development of Mapping for Change [Slide 13]. Because it was one of the first knowledge-based social enterprises that UCL established, setting it up was not simple – despite sympathy from senior management, it didn’t easily fit within the spin-off mechanisms of the university. However, by engaging in efforts to secure further funding – for example, through a cross-university social enterprise initiative – it was possible to support the cultural transformation at UCL.
There are also issues with the reporting of the impact of societal engagement [Slide 14]: Mapping for Change was reported as one of the REF 2014 impact case studies. From the university’s perspective, using such cases is attractive; however, if you recall that this research is mostly done with limited funding and resources, the reporting is an additional burden that does not come with appropriate resources. This lack of resources is demonstrated by Horizon 2020, which, for all its declarations on the importance of citizen science and societal engagement, dedicated only 0.60% of its budget to Science with and for Society [Slide 15].
We now move on to the experience of participants in citizen science projects, noting that we need to be careful about indicators and measurements.
We start by pointing to the wide range of activities that include public engagement in science [Slides 17-18] and the need to allow people to move into deeper or lighter engagement at different life stages and according to their interests. We also see that as engagement deepens, the number of people who participate drops (this is part of participation inequality).
For specific participants, we need to remember that citizen science projects are trying to achieve multiple goals – from increasing awareness, to having fun, to getting good scientific data [Slide 19] – and this complicates what we are assessing in each project and the ability to have generic indicators that hold true for all projects. There are also multiple forms of learning that participants can gain from citizen science [Slide 20], including personal development, as well as attraction and rejection factors that influence engagement and enquiry [Slide 21]. This can also be demonstrated in a personal journey – in this example, Alice Sheppard’s journey from someone with an interest in science to a citizen science researcher [Slide 22].
However, we should not look only at the individual participant, but also at the communal level. An example of this is provided by the noise monitoring app in the EveryAware project [Slide 23] (importantly, EveryAware was part of Future and Emerging Technologies – part of the top excellence programme of EU funding). The application was used by communities around Heathrow to signal their experience and to influence future developments [Slide 24]. Another example of communal-level impact is in Putney, where the work with Mapping for Change led to a change in the type of buses in the area [Slide 25].
In summary [Slide 26], we need to pay attention to the multiplicity of goals, objectives, and outcomes from citizen science activities. We also need to be realistic – not everyone will become an expert, and we shouldn’t expect mass transformation. At the same time, we shouldn’t assume transformation won’t happen and give up. It won’t happen without funding (including for participants and people who dedicate significant time).
The linkage of citizen science to other aspects of open science comes through participants’ right to see the outcome of work that they have volunteered to contribute to [Slide 28]. Participants are often highly educated, and can also access open data and analyse it. They are motivated by contributing to science, so a commitment to open access publication is necessary. This and other aspects of open science and citizen science are covered in the DITOs policy brief [Slide 29]. A very important recommendation from the brief is the recognition that “Targeted actions are required. Existing systems (funding, rewards, impact assessment and evaluation) need to be assessed and adapted to become fit for Citizen Science and Open Science.”
We should also pay attention to recommendations such as those in the League of European Research Universities (LERU) report from 2016 [Slide 30]. In particular, there are recommendations for universities (such as setting up a single contact point) and for funders (such as setting criteria to evaluate citizen science properly). There are various mechanisms that allow universities to provide an entry point for communities that need support. One such mechanism is the “science shop”, which provides a place where people can approach the university with an issue that concerns them and identify researchers who can work with them. Science shops require coordination and funding for the students who do their internships with community groups. Science shops and centres for citizen science are a critical part of opening up universities and making them more accessible [Slide 31].
Universities can also contribute to open science, open access, and citizen science through learning – for example, through the MOOC that we run at UCL, designed to train researchers in citizen science and crowdsourcing [Slide 32].
In summary, we can see that citizen science is an area that is expanding rapidly. It has multifaceted aspects for researchers, participants, and institutions, and care should be taken when considering how to evaluate these and how to provide indicators about them – mixed methods are needed to evaluate and monitor them.
There are significant challenges of recognition: recognition as valid, excellent research; sustainable institutional support; and the most critical indicator – funding. The current funding levels (<1% at NERC, for example) show that funders still have a gap to close between what they state and what they do.
Reflection on the discussion: from attending the workshop and hearing about open access, open data, and citizen science, I left realising that “societal engagement” is a very challenging aspect of the open science agenda – and citizen science practitioners should be aware of that. My impression is that with open access, as long as the payment is covered (by the funder or the institution), and as long as the outlet is perceived as high quality, scientists will be happy to comply. The same can be said about open data – as long as funders are willing to cover the costs and provide mechanisms and support for skills, for example through libraries, then we can potentially have progress there, too (although over-protectiveness about data by individual scientists and groups remains an issue).
However, citizen science opens up challenges and fears about expertise, and perceptions that it risks current practices, societal status, and so on – especially when considering the very hierarchical nature of scientific work, at the local level through academic job rankings, and within disciplines, where specific big names set the agenda in a field. These cultural aspects are more challenging.
In addition, there seems to be a misunderstanding of what citizen science is, conflating it with more traditional public engagement, plus a view that it can do fine by being integrated into existing research programmes. I would not expect to see major change without a clear signal, in the form of significant funding over a sustained period, indicating to scientists that the only way to unlock such funding is through societal engagement. This is not exactly “moonshot”-type funding – rather, pursue any science that you want, but open it up. This might lead to the necessary cultural change.