Citizen Science 2019: Getting your project off the ground workshop

The workshop was held as part of the Citizen Science 2019 conference and organised by Katrina Theisz (National Institutes of Health), Jennifer Couch (National Science Foundation), Ellen McCallie (National Science Foundation), Alison Parker (EPA), Pietro Michelucci (Human Computation Institute, Inc), and Claire Baert (Thin Crowd).

Ellen McCallie started the day with a session on how to write a competitive proposal for federal agencies, looking at the experience of NSF, NOAA, and NIH. There is an existing community of people within the funding agencies who are interested in citizen science and in supporting it. Ellen's role at NSF is in supporting informal STEM learning – from TV and crowdsourcing to citizen science. John McLaughlin from NOAA got involved through the GLOBE programme and became more and more engaged, through NOAA's office of education. Liam O'Fallon is at NIH, in the environmental health institute (NIEHS) – he came from involvement in global health, and focuses on environmental justice and community-based participatory research.

In terms of the proposal – which funding programme is right for you? If you go after a programme that does not match your focus, there will be a mismatch. Learn the criteria of the research programme, and remember what kind of people will be involved in the evaluation – for example, in the NSF informal education area, panels include practitioners, museum staff, and people from outside academia. A proposal needs to explain both the vision and the minute details – you need to get the details clear enough. Knowing the audience for the proposal is critical – e.g. NSF covers science, technology, engineering and maths as basic research – and you need to ask what the intellectual centre of gravity of the proposal is. For NSF, it has to be basic research, while NIH is made up of biomedical research institutes; the NIEHS has a strong emphasis on community-engaged research approaches and environmental health literacy. NIH does support applied research on resources, models and tools, and requires hypothesis-driven research. NOAA focuses on climate, weather, oceans, and coasts, and NOAA's citizen science projects are run in collaboration as part of its monitoring element. Research programmes create new projects, and then at the end of a project there isn't a clear continuation of funding – hence the whole issue of continuity.

In terms of organisational structure, there are limitations on the type of organisations that can receive US federal funding because of financial and reporting requirements, so small actors (say, an individual innovator) need to join forces with an organisation that can work with such funding and act as a contractor of some form. With NOAA and NIH there is the potential for a US entity to team up with international partners – but this depends on eligibility in a specific call. There are also issues for community organisations, which need to team up with bigger organisations that then take a significant part of the funding.

At NIH, on the citizen science and crowdsourcing side, there are people with a specific role to support community science and citizen science research, and the ability to work with community-led organisations.

In NSF, there isn’t a specific citizen science programme, and they try to do it throughout the programme – it can be part of the methodology or part of the public engagement element. The NSF focuses on the science of the proposal and suggesting the PPSR as integrated into the rest of the range. Looking at cit sci and crowdsourcing is now being looked at as a tool in the toolkit following the Holden memo on citizen science in 2015, o that changed the aim. It is a tool that can be used in multiple areas – education, outreach etc. The citizen science and crowdsourcing act also made a difference, and there is still an effort for the cultural change inside the funding organisations. The community of practice on citizen science mean that when proposals that include citizen science are getting into NOAA, people like John are being asked to help and select appropriate reviewers. The community of practice at citizenscience.gov is providing help to Federal Agencies in linking to people who understand the issues of citizen science and crowdsourcing. In NIH, the open call responses, there is an issue about identifying the right fit for investigator-initiated research proposals. In NSF, informal is the leading directorate, then bio, and then education. The NSF set an agency goal of learning about PPSR across the whole set of directorates, so there is knowledge about the existence of PPSR. There is an internal work inside the agencies to educate and inform the different officers to provide appropriate support for the reviewing process.

Back to proposals: read the call text at least three times. Pay attention to what kinds of ideas are requested – check previous awards. Pay special attention to the criteria. Check what was funded before – you must acknowledge existing and past projects and learn from them. Contact the funding officer when you have a one-page summary of the project idea – explain how it matches the criteria. It is also worth talking with a colleague, as programme officers are very careful not to give away ideas. It is OK to call the people who are responsible for the programme. In some regions of NOAA, the success rate is high for small grants. Across NSF, NIH, and NOAA the success rate is close to 10%, with many high-quality projects that are not funded.

A lot of PPSR comes in through informal STEM education within NSF. Some programmes are repeated across directorates. NSF has established 10 Big Ideas that are integrated into many programmes – including NSF INCLUDES (about broadening participation). There is also a dedicated website for informal science: http://informalscience.org/search-results?search_api_views_fulltext=PPSR. NSF continues to use Public Participation in Scientific Research (PPSR) as its core term.

In terms of criteria, the two critical criteria are intellectual merit and broader impact. NSF has a very strict anonymity principle of not publishing anything about who sits on a panel or who reviewed a proposal. If an image of a panel is shared, the whole panel is nullified and the process starts from scratch. NIH shares information about panel membership after the panel process. Reviewing is a 6–10 month process before an answer arrives.

Mistakes in proposals: "trust me" claims (without references or evidence); overselling the project as the best thing ever; general, vague, rambling writing; overemphasis on the rationale for the project at the expense of details of what will be done; "kitchen sink" proposals that include everything and hope that something is interesting for reviewers; not having a properly qualified team. And remember deadlines.

Check also this thread from @CitSciBio.


Panel on specific project stories:

Seth Cooper (Foldit) – development and support – how we did it. Foldit is a multiplayer protein-folding game that has been around for about 10 years, with over 500,000 participants. It uses puzzles and a leaderboard to do the work in this area. Players have managed to create new algorithms for structure refinement and to redesign enzymes, and new proteins are being designed. Foldit is spread across many universities – UW, Northeastern, Vanderbilt, UC Davis and UMass Dartmouth. To continue to run it, lots of people work on it and lots of people try to sustain the project. The Rosetta Commons is an organisation that helps the collaboration – you can see it in the credits on the fold.it portal. Seth started as a PhD student and is now a faculty member, so no longer part of the original funding. Funding came from DARPA, NSF, NIH, HHMI, Microsoft, Adobe, and RosettaCommons – pulling together funding from multiple sources. For example, they looked at the NIH Big Data to Knowledge (BD2K) call; another is the NSF call for enhancing community research infrastructure. How to continue a project and maintain funding for it over time is the key question – the science research infrastructure calls are about maintaining activities over time. There is also support from Amazon with cloud credits for research. They are developing Tile-o-scope, an AR citizen science demo.

Darlene Cavalier described her journey – from Discover Magazine on the publishing side, where she got interested in science issues. She looked at areas where it is possible to contribute more to science. During her master's studies, she stumbled on participatory research and citizen science, and started a blog in graduate school on projects in the area of participatory research. The 2006 blog evolved into a grant in 2010, with the development of a website that provides support to people who want to do something – curiosity and concern are the main motivators. The first grant was secured from NSF, and the site has continued to develop since – they also embed information about projects into places that people already use: from magazines to PBS. There are waves of funding to SciStarter, and they go after funding for things that interest them. Darlene then joined ASU in its school for innovation. SciStarter has collected a lot of data about referral sources (e.g. through The Crowd and The Cloud) and a lot of metadata about participants' journeys. They work in collaboration with universities that contribute to the platform as part of their grants. There is support on the project side, and there are also services – e.g. for the Girl Scouts. A specific interest group can have its own interests and manage its own groups. The Museum of Science also has its own portal, which can then be shared back with the portal funders. The service model provides further support. An ASU grant for setting up citizen science kits in libraries is a great example that shows the synergy between SciStarter and a university – each side wouldn't be able to do it without the other.

Andrew Robins – QuestaGame – started in Australia. https://questagame.com/. The focus of QuestaGame comes from a passion for the area over 25 years – a passion to save life on Earth and to deal with extinction in Australia. There is a need to engage people who are on screens and not engaged outdoors – kids create taxonomies within games in fantasy worlds, so it is possible to engage them in taxonomy in the real world. There is expertise that is dying out, as curators and entomologists disappear. QuestaGame is a mobile game to encourage people to learn about the classification of species. It works as a private company, but spends a lot of effort on securing funding to continue and develop the game. A lot of it is about building teams. The biggest revenue came from competitions that people can pay for – different BioQuests that they run using the platform, such as a university BioQuest that started in April 2018, with many sightings and classifications. They also have bioexpertise.org – collective intelligence is important, and there are contextual observations and details. Citizen science should push forward on economics, creating revenue streams and building up relations. Biosecurity threats are a new revenue area – for example, invasive species.

Pietro Michelucci – the challenges of getting a project going. He works on collaborations between humans and machines, with a background in cognitive science; he has worked with US funding agencies and been involved in different panels and programmes. Citizen science brings together two of the most powerful tools: science as a way to deal with the world, and crowdsourcing. There is a chicken-and-egg problem. The Stall Catchers project started from coming across a problem that fitted: it is not possible to solve it with machines; people care about the issue (Alzheimer's); etc. There was also an existing platform to build on – going back to Stardust@home in 2006. The problem is that to start the idea you need funding, but funders want to see evidence that it is possible. Problem 1 – will funders trust the data? Although the research question was about data quality, a reviewer responded that it is not possible to trust data from the crowd (!?!). This was a failure in communication. While waiting for funding, they decided to try and start the project anyway, and there is so much to learn. Build a prototype before applying for funding.

I also presented in the session the DITOs policy brief on business models of citizen science.


The second part of the workshop looked at specific project plans and was moderated by Kelly Edwards of UW, using the following questions: What is the need? What is your vision to fill this need? How does it fit into the bigger picture? What is the plan? What do you need – collaborators, support, expertise, and resources?

The last part of the workshop included pitching ideas and developing basic concepts of projects and receiving feedback from other participants.

