Notes for the workshop “Open Science by Design: Practical Commitments for Implementation by (Young) Universities – New Indicators — FAIR Data — Citizen Science”, part of the series “Focus on Open Science”, which took place at Universidad Carlos III de Madrid on 8 July 2019.
The opening was provided by Prof. Juan José Vaquero, Vice-President of Scientific Policy, Universidad Carlos III de Madrid. It is a relatively new university – 30 years old – and there are networks of young universities in Europe. As a European university, they have to be an Open Science university – they need to facilitate the systemic change towards a democratic and useful science that answers societal needs. They are following the LERU roadmap for culture change at universities, and have a working group on the implementation of open science at the university. They see value in including early career researchers in the workshop so they learn about the integration of open science from the start.
Cecilia Cabello, Director of Open and International Science, FECYT (Spanish Foundation for Science and Technology). FECYT has existed for 18 years and acts as an interface in its role as a science foundation. Their strategic plan aims to increase the value of science – bringing science closer to citizens and increasing the value of open science. Citizens should appreciate science and participate in it – engaging in citizen science. The open science work covers repositories, data, and dialogue with the different actors across Spain that will make open science a reality: including the ministry, experts on legal issues, and links to the EC. It’s not an easy task – it requires national and international policy, and the laws and approaches from other places need to fit the Spanish context.
Dr Tiberius Ignat, Director of Scientific Knowledge Services – this is the 17th workshop in the series across Europe. Open Science matters – there are two major concerns: nature and society. Digital societies are becoming part of our life, and we have all sorts of advantages – but at the cost of “free”. In the surveillance economy, we pay for free services by disclosing details about ourselves, and we are also manipulated with persuasive technologies. We need a move to research integrity, to ensure that citizens are involved in shaping their future, and to consider data, skills and education, scholarly communication, indicators and so on. There is a diversity of events across Europe. Open Science is not yet a big enough movement and needs to grow – we need a community to have a chance of changing something. There is also a LinkedIn group, “Focus on Open Science”.
Dr Eva Méndez, Chair of the OSPP and Deputy Vice-President for Research Policy (Open Science), UC3M. The focus on Open Science includes 8 challenges, components or pillars – there is a need to consider the pillars that are not naturally linked: indicators, citizen science, and FAIR data. Between these pillars there are plenty of questions – should we assume that all citizen science data will be FAIR? Will the data provide indicators? etc.
Dr Paul Ayris, Pro-Vice-Provost, University College London, UK: Leading the change to Open Science in European Universities. Using UCL and its networks to learn about open science, and also looking at the LERU roadmaps to learn about different aspects. The LERU roadmap talks about the 8 pillars of open science, with a focus on the future of scholarly communication, the European Open Science Cloud, and rewards. The LERU roadmap was launched in June 2018, includes 41 recommendations on what a university needs to do, and was signed by the rectors of these universities. Some of these recommendations evolved at UCL – a highly research-active university, so publishing is the lifeblood of the university. There are open access calls in Plan S and statements from LERU. While there is critique of Plan S, there is no rejection of open access in general. LERU rectors agree that open access is the way to go – rather than the messy and expensive current model. In the wake of Plan S, there is a need to ensure that academics are aware of OA compliance requirements. There is a report from the OAI11 workshop in Geneva that can be useful. At UCL this led to a Green OA repository that is already working. UCL established an alternative platform in 2015 with monographs, which then evolved into journals – following academics’ demands and needs. After discussions with different people, setting up UCL Press was a way to assist in an area that might otherwise fall over – it’s expensive, and the number of books being sold is very small. It has published 106 monographs with 2m downloads in 231 countries. The books are also shared on JSTOR, which helps the download figures. One of the most downloaded books is an output from the ERC project of Danny Miller in Anthropology, with 320k downloads. A research monograph typically sells about 200 copies worldwide, and the arts and humanities are concerned about the change to this model – in contrast, open access demonstrates that monograph publishing can provide a huge readership.
Scholars involved in monograph publishing might be against Plan S but for OA. Dublin City University is the first to buy white-label services from UCL Press. All outputs are branded as Dublin City University Press – the download figures won them over.
Other research areas covered in the pillars: in research data, there is free access and use, or restricted use. UCL established a research data repository, launched in May, where academics can deposit data that is useful for reuse. Data that is not sensitive is shared on a system based on Figshare. Authors in UCL Press will be able to store their data in this repository. There is a need to provide the tools and services to support academics in the movement to open science.
Finally, there is a need to support rewards and evaluation for academics. For the Plan S implementation, we need a large, significant and determined consensus on new ways to evaluate research and researchers (Bernard Rentier of the University of Liège). He identified 23 criteria for a rounded evaluation. In the traditional way, people point to journals as a mark of quality and success. In the new evaluation, publications are seen as one among many measures being assessed. There are also the San Francisco Declaration on Research Assessment and the Leiden Manifesto, which are being assessed at UCL. We follow the approach of not using numerical factors such as the Impact Factor, and of reducing the number of papers submitted for assessment, which are then read. Bibliometrics are used only as an adjunct to qualitative assessment, and assessment is based on a qualitative evaluation of the individual. Open science principles are also included in the career frameworks for researchers and are integrated into the framework for academic promotion – this was accepted by the UCL academic board.
LERU is committed to Open Science and Open Access, but doubtful about the Plan S implementation – there is a need to manage costs and to avoid them escalating out of control. The mechanisms for moving from subscriptions to publishing are yet to be structured so that they are manageable for research-intensive universities.
Open Science is good for researchers. UCL sees it as an opportunity, not a threat.
Dr Rebecca Lawrence, F1000: Shifting the research assessment system to enable the adoption of open knowledge practices. Rebecca manages F1000 Research and other initiatives that provide open access platforms for funders. There is a need to shift the academic evaluation approach to support open science/research. We mean different things by it – the OSPP has a definition (figure above). There are major barriers to open research, and it is incredibly challenging to move towards it. The primary focus of evaluation is the final scholarly output and its venue of publication, and this is ingrained from the research, to the researcher, to the institution (how league tables are calculated). There isn’t enough support at ground level – on awareness and understanding: why it is good for researchers, and how. There is also a lack of skills – e.g. how to make your data FAIR? – at all stages of a researcher’s career. There is a collective action problem among stakeholders. There is also a lack of infrastructure and funding – to share a wide range of outputs, and to capture and integrate metadata, such as the institution, ORCID, etc. There is also a broad range of indicators which are not being used.
To overcome the barriers, we need to resolve the conflicting policies between organisations and funders. We need to be clearer about OA in terms of implementation – we need policies that are linked to implementations. We need tools and infrastructure that make it easy for the research community to act in an open research way. We need to maximise reporting and minimise duplication of effort through metadata and interoperability between systems – not reloading and replicating reporting. We need training, and we need to rethink rewards and incentives – without changing these, we won’t see a move. DORA has over 500 organisational and 12,500 individual signatories – but not a lot of implementation, so DORA started recording good practices. For example, CRUK focuses its assessment on 3-5 research achievements, with publications only at the end – not only top publications. FWF also asks for the 10 most important scholarly/scientific achievements. NIH asks for clearer biosketches. The University Medical Center Utrecht involved people from across career stages to establish criteria.
The Open Science Policy Platform includes different stakeholders and provided recommendations on next-generation metrics and indicators – less about journals and impact factors, more about promoting discussion of the quality of the publication and of all sorts of outputs. There is a need to assess and experiment with the validity of new indicators. The ORCID iD is a way to identify researchers, and providing biosketches is an area that is being explored. There is a need to pay attention to rewards that encourage ECRs to move towards open access, and to provide public and easily accessible information about what is changing, so it can be communicated to researchers. We need to change the view of careers – not just narrowing down towards a professorship as the ultimate goal of a research career.
Paul Wouters points out that indicator frameworks can lead to unintended consequences or a “steering effect”, and warns against tokenistic changes in place of a deeper culture change. There are three levels – the scientific system as a whole and its infrastructure, organisations, and individuals. There are different toolboxes – open knowledge infrastructure, open knowledge capabilities, capturing open knowledge practices (qualitative and case studies), and the individual level.
The OSPP recommends OS coordinators – to share best practices and help with consistency. These are starting in the Netherlands, Finland and Ireland. The Dutch example is setting up a national platform and different activities. There is a need for both top-down and bottom-up efforts in making a change to open science. The OSPP’s next steps are working with the implementation initiatives and coordinating different pilots – at the stakeholder, institutional, national, and domain-specific levels. There is a need to help with open evaluation and to share successes and failures.
Dr Daniel Hook, Digital Science: The Ascent of Open Access. Digital Science is a tool provider that enables the things universities need to manage aspects of open science, and science in general. Digital Science did a study on the ascent of open access (https://www.digital-science.com/blog/news/the-ascent-of-open-access-report/) – analysing the situation in 2000, you see in open access the strength of the US, UK, and Japan. Over time, you can see different strategies for adopting open access. By 2016, China had gone from nowhere to being number 2 in the world, and number 3 in open access. The UK maintained its global position through open access. Assessing the volume of material that is open access, we see the impact of the change: about a third of outputs became OA within 16 years. We can also see a change in citations – the 60% of publications that are closed received only 52% of citations. Open access internationally collaborative research leads to proportionally very high citation rates. At the institutional level, UCL, Oxford, Cambridge and Imperial are at the top in open access, and then there are consistent sets: bands of engagement. The cultural nature of this engagement is quite apparent. In the Spanish system, there is a more homogeneous adoption of OA.
We need to consider designing for reproducibility – in terms of location, shutter speed, and all sorts of context. When you log in with ORCID on a lab experiment, that can link to context; we can think about capturing context automatically. We need layers beyond a publication: layers of data, experiment design, ethical approval. The presentation of research is becoming disaggregated – we will have all sorts of other aspects: automated metadata of the narrative, links to data, details of peer review, machine-readable narratives – and we can think about a shift from publications to processes. This is demonstrated in Digital Science’s Dimensions system. Trust is significant – we need to communicate that the data is not simplistic.
Ghislain Onestas, Ex Libris: Putting the library at the heart of research. Covering the cloud provision of the company. They consider the Schonfeld (Scholarly Kitchen) workflow of the scientific process. We need to navigate the complexity of the research process, which creates complexities for researchers (e.g. dealing with funders and their needs), for libraries (e.g. embargoes), and for the research office (compliance with Plan S). So they want to offer a cloud system that will make this easier to do (the Esploro system).
Prof. Barend Mons, Director of the Dutch International Support and Coordination Office of GO FAIR and President of CODATA: The Internet for Social Machines. FAIR = “the machine knows what I mean”. The FAIR principles were noted – they seem complicated to many people, so FAIR is about helping machines know what I mean. The point is to make machines capable of helping humans, because of the growth in data volume (e.g. in biomedical research). We need an internet for social machines – where people and machines can work effectively together. Articles will start having a minor role (e.g. the seven sins of open science). First, we need to think about the problems – data that cannot be reused is “re-useless”. FAIR is not a standard – it’s a guiding principle. Open is not “free as in beer” – don’t mix these up – and it is expensive: accessible under well-defined conditions. We also need to stop talking about AI and consider it as machine learning – mostly stupid stuff. We need to consider stewardship, and to think about the use of the data for years to come, beyond the research period. Instead of data sharing, we need to consider data visiting – you don’t want to send a petabyte, but to go and visit the data. The need for machine-readable data is about being “fully AI ready” – all about making life easy for machines. FAIR started in 2014, the principles were published in 2016, and the GO FAIR initiative is developing the network under the FAIR principles – the Internet of FAIR Data and Services (IFDS). We need data (somewhere), tools (in another place), and compute ability, so that the tools, compute, and data can be brought together. We can have distributed learning via VMs, which can learn things from sub-analyses – we need a completely new approach to data and analysis. We need to be careful to manage data and metadata.
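To make the “machines know what I mean” idea concrete, here is a minimal sketch of machine-readable dataset metadata using schema.org’s Dataset vocabulary in JSON-LD, with a trivial check for some FAIR-relevant fields. The field choices, the DOI, and the `fair_gaps` helper are illustrative assumptions, not an official FAIR specification or GO FAIR tooling.

```python
# A minimal, illustrative metadata record for a dataset, loosely following
# schema.org's Dataset type expressed as JSON-LD. The DOI and ORCID below
# are hypothetical placeholders.
record = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "identifier": "https://doi.org/10.5281/zenodo.0000000",  # hypothetical DOI
    "name": "Example air-quality observations",
    "description": "Hourly NO2 readings collected by volunteers.",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "creator": {"@type": "Person",
                "@id": "https://orcid.org/0000-0000-0000-0000"},
}

def fair_gaps(rec):
    """Return the FAIR-relevant fields a record is missing: a resolvable
    identifier (Findable), a licence stating the conditions of access and
    reuse (Reusable), and a human- and machine-readable description."""
    required = ["identifier", "license", "description"]
    return [field for field in required if not rec.get(field)]

print(fair_gaps(record))  # prints [] – this record has no gaps
```

A real implementation would validate against richer profiles (e.g. community metadata standards per domain), but even a record this small lets a harvester resolve the identifier, check the licence, and attribute the creator without human intervention.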
The complexity is beyond human comprehension – for example, across the links between diabetes and Alzheimer’s, machines can find quick links between research areas, filtering thousands of papers.
Systems evolve from a vision to an explosion, and then through convergence – if we have minimal standards, voluntary participation and critical mass. This way the system can grow, and local versions can develop that suit local needs. The final collaboration is in CODATA. There is a growing community of people who share tools, approaches and cross-support, which means the development of bottom-up standards and approaches. There is growing investment in open data – pulling data infrastructure together will help. There needs to be a system that supports research stewardship and a digital competence centre in universities. The humanities also need to be included in FAIR, and data stewardship problems are more complex across disciplines. We can consider the “digital twin” of any object – a book, a butterfly in GBIF, etc. – which can be asked questions. We lose 80% of data within 2 years, and $10 billion a year because of the loss of access to data.
Prof. Muki Haklay, Professor of Geographic Information Science at the Department of Geography, University College London (UCL): Citizen Science in Open Science context: measuring and understanding impacts of deeper public participation in science.
My talk started with an overview of citizen science in current practice. I started by pointing to the rapid growth in the recognition of citizen science in the past decade, noticing that even in the selective and conservative world of the publications indexed by Scopus we can see the rapid rise. The two critical trends behind it are societal changes – in particular the rapid growth in higher education – and technological changes. I then progressed to review major citizen science activities, following the overview of Citizen Science for Earth Observations [5-15], using examples from France (Sauvages de ma rue), mentioning the BOINC effort of Ibercivis in Spain, and different H2020 projects – such as Geo-Wiki and its use in LandSense, or Odour Collect in D-NOSES. The next 3 slides [16-21] pointed to the way specific disciplinary practices and framings of scientific research play out in citizen science projects: from astrophysics to biomedicine to geography and anthropology – each discipline shapes the projects that are called citizen science within its scope. I ended this part of the talk with a note on the different levels of participation [22]. The next part provided an overview of the policy and practice response [23-28] – the early response of the European Environment Agency, the creation of associations at international and then at national levels, and the emergence of laws and regulations that explicitly mention citizen science. Following this, I pointed to the variety of practices in citizen science, and positioned it within the wider public engagement framework with the DITOs escalator, while also pointing to the scale change in participation between different levels.
Based on this, I introduced the Austrian criteria for citizen science and explained what is wrong with them, and how they are unhelpful for evaluating the field [32]. I showed that research demonstrates the multiple goals of citizen science and that no single project will fulfil all of them [33], that we have learned about complex learning and creativity patterns [34], and that the logic model of a project like DITOs shows complex paths for the public, policy makers, and scientists. I then suggested how to progress carefully with evaluation, proposing several potential models for funders to consider.
Summary of the day:
There is a need to consider a common alignment – the future and the direction of travel.
We have existing and emerging scientific publications, and the information needed to modify reward systems and career progression.
There are existing solutions for linking research assets. There is a need for change – but change implies effort, and we can’t wait for “someone” to take action. The time is ripe to deal with change – we’re asking for more money and we need it to be justified.
We need to move to multidisciplinarity – as expressed in the SDGs – but we don’t have a system to address them. We need more awareness, dialogue, engagement, and trust-building. There is a need to build trust with researchers, funders, and the public, to create opportunities, provide better incentives, and lead to a change in culture.
The questions that were asked in the meeting are at https://app.sli.do/event/umcjyzql/live/questions
The Twitter stream https://twitter.com/search?q=%23os19mad&src=typed_query&f=live