The afternoon session started with web development insights.
Taking on the Challenges of Broadening Participation in Data Visualization and Analysis with FieldScope
Daniel Edelson – BSCS – covered FieldScope, which allows people to collect data, design data-collection forms, and visualise and analyse the results. He described the Chesapeake Watershed Water Quality study: the area that influences the bay is very large, and information is collected at whatever times schools happen to be doing fieldwork, so there is considerable variability in data collection. A key challenge is that very few teachers and students get to the stage of analysing the data – all the allocated time was used to get people to the data-collection stage, and analysis tools were mainly used to decide where to collect. They want more flexibility in data protocols, aim for more reliable participation, and are trying to move participants on to analysis. The lesson is that effort should be invested in a more active and structured process of engagement, involving schools throughout the process.
Patterns of Behaviour Across Online Citizen Science
Chris Lintott – Zooniverse.org; Helen Spiers* – University of Oxford; Grant Miller – University of Oxford / Zooniverse; Lucy Fortson – University of Minnesota; Laura Trouille – Adler Planetarium. Zooniverse is now 10 years old, with many projects; the analysis pulls data from 63 projects (ecology and astronomy) from 2012 to 2016, a dataset of 146,243,599 classifications. Looking at classification activity in the first 100 days post launch, there is a wide range: projects peak after launch and then drop off, apart from those with regular communication with their volunteers. There is high heterogeneity in the number of unique volunteers, with more volunteers in astronomy, and participation inequality across projects. Google Analytics shows that projects appeal to different audiences: astronomy has more male participation, while ecology is closer to gender parity. There are open questions about what to do with over- and under-represented groups. They are also analysing user movement between projects.
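Participation inequality of the kind described in the talk is commonly quantified with a Gini coefficient over per-volunteer classification counts. A minimal sketch of that measure (the function and example counts are illustrative, not taken from the Zooniverse dataset):

```python
def gini(counts):
    """Gini coefficient of per-volunteer classification counts.
    0.0 means everyone contributes equally; values near 1.0 mean
    a small minority of volunteers does almost all the work."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula for sorted data:
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    cum = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * cum) / (n * total) - (n + 1) / n
```

For example, four volunteers each doing one classification give a coefficient of 0.0, while one volunteer doing all the work in a group of four gives 0.75 – the sort of skew that "participation inequality" refers to.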
Validated Dynamic Consensus Approach for Citizen Science Projects Employing Crowd-based Detection Tasks
Pietro Michelucci – Human Computation Institute. Pietro runs EyesOnAlz and wants to share problems and solutions. The goal is crowdsourced classification. One problem is random responders and bots, as well as people with other malicious intentions, so they draw on lessons from psychophysics about separating sensitivity from bias: just as a radar operator must decide whether a real object requires an alarm, signal detection theory lets you tease apart the sensitivity of the apparatus from the bias of the operator, by measuring the process of putting information in. Another problem is how to combine results from the crowd. They carried out a validation study and found that at around 15 classifications per item they reach the research-grade threshold at which the data can be used; they use 20 classifications to ensure high-quality data. A further problem is analytic efficiency – not wasting people's time – so they started assigning weights to participants and stopping once enough information has been gathered. A paper by Willett et al. 2013 on Galaxy Zoo 2 shows how to assess expertise, and Marshall et al. 2015 (the Space Warps paper) extends this approach to measuring confidence collectively. The number of classifications needed drops to between 2 and 10, usually around 5, which makes much better use of people's time.
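The two ideas in this talk – separating sensitivity from bias, and a weighted consensus that stops early – can be sketched as follows. This is a minimal illustration, not the EyesOnAlz implementation; the function names, weights and threshold are assumptions for the example:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).
    Separates how well a responder discriminates (sensitivity) from how
    trigger-happy they are (bias), which shifts both rates together."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

def dynamic_consensus(votes, weights, threshold=4.0):
    """Accumulate weighted yes/no votes for one item and stop as soon as
    the weighted evidence passes a confidence threshold, saving effort.
    votes:   iterable of (volunteer_id, vote) with vote in {+1, -1}
    weights: dict of volunteer_id -> reliability weight (default 1.0)
    Returns (decision, number_of_votes_used); decision 0 if undecided."""
    evidence = 0.0
    n = 0
    for n, (vid, vote) in enumerate(votes, start=1):
        evidence += weights.get(vid, 1.0) * vote
        if abs(evidence) >= threshold:
            return (1 if evidence > 0 else -1), n
    decision = 1 if evidence > 0 else (-1 if evidence < 0 else 0)
    return decision, n
```

With two trusted volunteers (weight 2.0 each) agreeing, the rule stops after only 2 votes instead of collecting a fixed 20, which is the efficiency gain the talk describes for the Space Warps-style approach.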
Working Together: Developers and Project Leads
Robert Pastel – Michigan Technological University. App development is not done in a vacuum: participants, developers and project leads are all involved, and for a successful application all these core participants need to work together. The methodology combines participatory design and user-centred design (UCD) principles with Agile development. The participatory design is done with the project leads. The aim is to have an MVP within the first three months and to start a new app development after it.
A ‘Night in the Cloud’ – Geoff described his background in TV programming and his observation that there are plenty of definitions of citizen science; for The Crowd and the Cloud they use “science for, and by, the people”, and they set out to turn viewers into doers. Waleed recalled his interest in science, pointing to “Earthrise” and the “Blue Marble” as influential ways of viewing the Earth; there is also the power of the face-to-face, up-close-and-personal perspective. The data are impressive – 2.3 million volunteers in environmental conservation, some $2.5 billion worth of effort. Rick Bonney noted that for many years there had been a need for television involvement to make citizen science visible, and when Geoff called, a quick Google check revealed his involvement in Cosmos, so he got back to him to support the making of the programme. The programme also helped EyesOnAlz, which addresses the analysis of vessels in video; the visibility of the project on The Crowd and the Cloud helped increase participation. Waleed noticed the commitment and interest of participants, their enthusiasm and connection to the environment: the best route to high-quality data is people who care passionately about what they are measuring. Jennifer Shirk used resources from The Crowd and the Cloud to create a programme for out-of-school activities. The link to SciStarter helped convert viewers into active participants, and Waleed was struck by participants’ commitment and passion for producing real science of high quality. The close and personal perspective is important for understanding the world and its potential.
Below are the clips that were prepared by The Crowd and the Cloud – the second shows the late Gill Conquest.