If you follow the discussion (search on Twitter for #geothink) you can see how it evolved and which issues were covered.
At one point, I asked the question:
It is always intriguing and frustrating, at the same time, when a discussion on Twitter takes on a life of its own and often moves away from the context in which the topic was originally brought up. At the same time, this is the nature of the medium. Here are the answers that came up to this question:
You can see that the only legal expert around said that it’s a tough question, but of course, everyone else shared their (lay) view on the basis of moral judgement and their own worldview rather than legality, and that’s also valuable. The reason I raised the question was that during the discussion, we started exploring the duality in the digital technology arena between ownership and responsibility – or rights and obligations. It seems that technology companies are very quick to emphasise ownership (expressed in strong intellectual property rights arguments) without taking responsibility for the consequences of technology use (as expressed in EULAs and the general attitude towards users). So the nub of the issue for me was about agency. Software does have agency of its own, but that doesn’t mean that it absolves the human agents – be they the software developers or the companies – from responsibility for what it is doing.
In ethics discussions with engineering students, the cases of the Ford Pinto or the Thiokol O-rings in the Challenger Space Shuttle disaster come up as useful examples for exploring the responsibility of engineers towards their end users. Codes of ethics exist for GIS – e.g. the code of ethics of URISA, or the material online about ethics for GIS professionals and in Esri publications. Somehow, the growth of the geoweb took us backward. The degree to which awareness of ethics is internalised within a discourse of ‘move fast and break things‘, a software/hardware development culture of perpetual beta, a lack of duty of care, and a search for a fast ‘exit’ (and therefore IBG-YBG) makes me wonder which mechanisms we need to put in place to ensure the reintroduction of strong ethical notions into the geoweb. As some of the responses to my question demonstrate, people will accept the changes in societal behaviour and view them as normal…
19 February, 2015
The week that passed was full of citizen science – on Tuesday and Friday the Citizen Science Association held its first Board meeting, the Citizen Science 2015 conference ran on Wednesday and Thursday, and to finish it all, on Friday afternoon a short meeting of a new project, Enhancing Informal Learning Through Citizen Science, explored the directions that it will take.
After such an intensive week, it takes some time to digest and think through the lessons from the many conversations, presentations and insights that I’ve been exposed to. Here are my main ‘take away’ lessons. The conference itself ended with members of the Board of the Citizen Science Association (CSA) describing their ‘take away’ in short Twitter messages, which were then followed by other people joining in, such as:
In more detail, my main observations are about the citizen science research and practice community, and the commitment to inclusive and ethical practice that came up in different sessions and conversations.
It might be my own enthusiasm for the subject, but as in previous meetings and conferences about citizen science, you could feel the buzz during the event, with participants sharing their knowledge with others and building new connections. While there are already familiar faces and the joy of meeting colleagues in the field of citizen science that you already know, there are also many new people who are either exploring the field or are active in it but new to the community of practice around it. As far as I can tell, the conference was welcoming to new participants, and the poster session on the first day and the breakfast on the second day provided opportunities to create new connections. It might be because people in this field are used to talking with strangers (e.g. participants in citizen science activities), but this is an aspect that the CSA needs to keep in mind to ensure that it stays an open community and not a closed one.
Secondly, citizen science is a young, emerging field. Many of the practitioners and researchers are in the early stages of their careers, and within research institutions, the funding for the researchers comes through research grants (known in academia as ‘soft money‘) as opposed to budgeted and centrally funded positions. Many practitioners are working within tight and limited government budgets. This has implications for ensuring that funding limitations don’t stop people from publishing in the new journal ‘Citizen Science: Theory and Practice‘, and that those who can’t attend the conference can find information about it in blogs, see a repository of posters that were displayed at the conference, or read curated social media outputs about it. More actively, as the CSA did for this meeting, funding should be provided to allow early career researchers to attend.
Third, there is clearly a global community of researchers and practitioners committed to citizen science. Yet the support and networks that they need must be local. The point above about budget limitations reinforces the need for local networks and for meeting opportunities that are not too expensive to attend and participate in. For me, the value of face-to-face meetings and discussions is unquestionable (and I would hope that future conferences will run over 3 days to provide more time), and balancing travel, accommodation and budget constraints with the creation of a community of practice is something to grapple with over the coming years. Having a global community and a local one at the same time is one of the challenges for the Citizen Science Association.
Finally, the conference hosted plenty of conversations and discussions about the ethical and inclusive aspects of citizen science (hence my take away above) – from discussions about what sort of citizenship is embedded in citizen science, to the need to think carefully about who is impacted by citizen science activities. A tension that ran through these discussions is the value of expertise – especially scientific expertise – within an activity where citizen scientists are treated respectfully and their knowledge and contributions appreciated. The tension is emphasised by the contrast between the hierarchical nature of the academic world and the ‘flatter’ or ‘self-organising’ hierarchies that emerge in citizen science projects. I would guess that it is part of what Heidi Ballard calls ‘Questions that Won’t Go Away’ and will need to be negotiated in different projects. What is clear is that even in contributory projects, where the scientists set the project question and the protocol and ask participants to help in data collection or analysis, simple hierarchical thinking of the scientist as expert and the participants as ‘laity’ is going to be challenged.
If you want to see other reflections on the Citizen Science 2015 conference, see the conference previews from Holly Menninger and Caren Cooper, and post-conference reports from Monica Peters, which provides a newcomer’s view from New Zealand, while Kelsey McCutcheon provides an American one, and Sarah West offers an experienced citizen science researcher’s view. Teresa Scassa provides a view on intellectual property and citizen science, and the Center for Advancement of Informal Science Education (CAISE) posted a summary of the conference and a Q&A with the CSA. Finally, there is a report from the Schoodic Institute, the sponsor and host of the CSA.
12 February, 2015
San Jose was the location of the first Citizen Science Association meeting, on the 11th and 12th of February. The level of enthusiasm for citizen science among researchers and practitioners was palpable, even well before the conference – the conference organising team were coping with an overwhelming number of submissions and abstracts that they needed to fit into a two-day programme. In the end, the conference ran with 7 parallel sessions, and many posters in the reception session on the first day, as a way to allow as many participants as possible to present their work. As can be imagined, what you read in the rest of this post is just 1/7 of what was going on!
Rick Bonney (who was elected as the treasurer of the association just the day before the conference) started by emphasising the reasons for having the Citizen Science Association (CSA): learning from others, developing synergies between projects and sharing information collaboratively, so we can start to solve the wicked problems society is facing by finding the answers that we need, together. He noted that the CSA now has 3,000 members and a new journal, Citizen Science: Theory and Practice, and that the conference had almost 650 participants.
Following Rick, Lila Higgins and Alison Young, the joint conference chairs, opened the conference, with Lila promoting the use of the hashtag #WhyICitSci on Twitter to gather the range of reasons why people work in citizen science. A few of those are:
Mine was:
The opening talk, ‘A Place in the World – Science, Society, and Reframing the Questions We Ask‘, was given by Chris Filardi of the Center for Biodiversity and Conservation, American Museum of Natural History. Filardi, an evolutionary biologist by training, has come to recognise the importance of wide participation in science for understanding our place in the world. He started his career by going to New Guinea to study birds in remote places. Through interaction with villagers in the highlands of New Guinea he learned that science depends on a sense of purpose and on understanding the set of relationships that people have with their natural environment. Over the years, he started realising that many of the datasets that he was using come from citizen science, and wouldn’t be there without the effort of the volunteers. He admits that he is new to the field, but what he noticed is the lively discussion around its definition. Once he read Alan Irwin’s Citizen Science he found the explanation that he liked, which sees citizen science as the point where analysis and intervention meet each other – and that was a huge personal discovery. For scientists, there is an amazing wealth of knowledge that is born from local systems – social systems that deal with the local environment. But scientists are preaching from their own pedestal. If we engage people in the full life cycle of science, we can get a more meaningful relationship between science and society. So much about citizen science and society was revelatory to him. Citizen science helps in exposing obvious things. It helps reframe the questions that we ask – as examples from working with indigenous people on their relationship with the forest, which was also linked to the preservation of grizzly bears, demonstrate.
Citizen science is a touchstone in linking science and other perspectives – when people are involved in the full life cycle of science, it allows noticing pre-existing values and practices, and it is the community that helps lead to the wished-for results. Conservation science can provide communities with evidence that will allow them to improve the protection of their environment. By working in a participatory way, we can get better results. Citizen science also reveals risks worth taking – scientists are scared of bringing people who are not trained scientists into scientific projects, and should learn to do so. Engaging a wider audience in data collection can also bring risks – for example, respecting areas that are taboo for local people, which are sometimes protected through such mechanisms, and not insisting on exploring them. We need to consider the costs of insisting on science – for example, carrying out a survey in an area that the community has decided not to go to because of their beliefs. Science in the name of evidence can harm relationships – we need to know when science needs to step back. Citizen science can link talk, action and symbol – that is, social discourse, actions and beliefs – and help in dealing with some of the challenges that we have as a society. Sometimes there are risks for the scientists themselves – admitting that the scientific process was compromised because of local beliefs can be problematic among scientists. The discussion that followed the talk also mentioned participatory action research and participatory science as names for the topic.
The second session that I attended was 1D Re-Imagining Citizen Science for Knowledge Justice – A Dialogue with Tom Wakeford, Alan Irwin, Erinma Ochu, Michel Pimbert, and Cindy Regalado. The session was organised as a dialogue, with several groups of 8 in the room linking up afterwards. The dialogue ran as a facilitated discussion of the Questions That Won’t Go Away (QTWGA) in participatory action research which Heidi Ballard recently identified – and of how to either solve them or learn to live with them. The groups explored the tensions and challenges in the practice of citizen science, and how these have been resolved, or lived with. This was followed by imagining what the vision for citizen science should look like. The group that I found myself impromptu facilitating identified data handling – which includes understanding quality, sharing of information, ownership of data and ethics – as a major issue in citizen science that won’t go away. Other groups challenged the term ‘citizen’ from both scientists’ and participants’ perspectives. The need to understand the ‘citizen’ side demands respect for people’s financial resources and thinking about compensation – are we treating participants properly and going beyond free labour? Another issue is how expertise is defined – there are different types of expertise from scientists and participants, and these need to be negotiated. For example, when research is done with communities, it is important not to further stigmatise those communities, and to feel obliged to provide information back.
A second round explored the vision for future citizen science. In my group, concepts of place-based citizen science or, in the medical field, disease-based citizen science were proposed – with a lot of attention on community-focused and community-based research on issues that are set by the community. There was also a wish for truly collaborative citizen science involving decision-makers, industry and scientists. Other ideas that were raised were to educate natural scientists to do citizen science as part of their training, seamless collaborations with the data being properly used, and citizen science that is funded for the long term.
The next session that I attended was 2G Talks: Tackling Grand Challenges and Everyday Problems with Citizen Science. The first talk was by Gianfranco Gliozzo from ExCiteS, titled Using Citizen Science to evaluate the cultural value of biodiversity (co-authored with Elizabeth Boakes, David Roy, myself, and Chloe Smith). Gianfranco described a project that is funded by the UCL Grand Challenge of Sustainable Cities programme. The study looked at cultural ecosystem services – the inspiration that people receive from nature and that influences their wellbeing. The approach focuses especially on learning about the cultural services of the environment from citizen science data. The project is specifically looking at data for Greater London, which has almost 50% of its total area as vegetated space. The data comes from iSpot, iRecord and GiGL. So far, they have found an emphasis on birds and flowering plants in terms of taxa. The study also looked at spatial patterns and has started to identify the heterogeneity in data collection, with some hot spots of more activity. The conclusion so far is that there is a lot of value in integrating citizen science data and understanding the patterns, but this is challenging, and it is also important to appreciate the diversity of data sources and their contribution to the total information.
Karen James discussed Combining Citizen Science and DNA-Assisted Species Identification to Enable “A New Kind of Ecology”. Karen opened by explaining that identifying the taxonomic classification of a species during citizen science activities is a challenge. There are tools such as Leafsnap (to recognise patterns of leaves) or Wildlife Acoustics tools that help in the process, but identification and classification remain very challenging. She specifically focused on DNA barcoding, which allows extending expertise – the Barcode of Life is a website dedicated to this. DNA identification provides further validation, and there is an effort to create a library of DNA sequences of species. She demonstrated the potential of the approach by using DNA to identify invasive species. She also sees potential in engaging DIY bio enthusiasts in doing this work.
John Tweddle’s talk, Beyond Transcription: Realising the Research Potential of Museum Specimens Through Citizen Science (co-authored with Mark Spencer and Lucy Robinson), discussed work at the Natural History Museum (NHM) in London. NHM has extensive engagement in citizen science, from molecular biology to field work. John focused on unlocking the collection of the museum – they have 3 billion specimens and the metadata for them. It’s a treasure trove of information – and most of it is locked away. There is an easy way to take pictures of specimens, but the metadata is handwritten, so there is an interest in crowdsourcing the preparation of the data for further analysis. However, there is so much more that volunteers can do beyond transcription. John suggested moving forward – engaging participants in measurements and in deriving further information from the digitised specimens. There is also potential to add place-based knowledge to enhance the information in the collection, and then to design new and enhanced projects. He used an example from the Robert Pocock Herbarium project, started by amateur historians who, once they came to the museum, got engaged in classification – so it ended up as a community-led project for backtracking where the specimens were collected and adding contextual information.
Alison Young’s talk, Acting Locally and Thinking Globally: Building Regional Community around Citizen Science to Broaden Impacts and to Create a Scalable Model (with Rebecca Johnson), covered the work of the California Academy of Sciences (CAS), which focuses on biodiversity research and holds 45 million specimens. CAS considered how they can engage citizen scientists in the same way that they work with researchers – they aim for their citizen science to be used both for research and for managing biodiversity. They started with Mount Tamalpais, with the aim of creating a benchmark and recording the biodiversity in many ways, including adding specimens to the herbarium. They define their community as the citizen scientists, those that might want to use the data (scientists and government), practitioners, and organisations and groups that are doing related work. They are relying on iNaturalist as part of their engagement plan, and are also considering grass-roots bioblitzes that people can do more easily than full ones. The ability of people to come and document species with iNaturalist somewhere close by is valuable, and people can engage in a short exercise of just a few hours. Nerds for Nature are helping in establishing rapid bioblitzes.
As part of session 3A Speed Talks – Across Conference Themes, Simon Lambert (Lincoln University) covered Indigenous Peoples as Citizen Scientists. Simon is from New Zealand and talked about the people he comes from – he has worked with Māori communities for some years. He noted the lack of first nation people in many conferences and meetings. There is a history of science in which people were treated as specimens, and working with them requires recognising this history and view of science. There are agreements at the global level that recognise indigenous people, such as the CBD, TRIPS and UNDRIP, but asserting themselves and their sovereignty – for example, saying no to a science project – remains challenging for indigenous groups. Good science comes from great politics – inclusive, ethical, acknowledging First Citizens as First Scientists – pushing into the social sciences to effect change.
In session 3G Tackling Grand Challenges and Everyday Problems with Citizen Science, Christian Adams covered Google tools in From the Ground to the Cloud: Groundtruthing Environmental Change (co-authored with Tanya Birch and Yaw Anokwa). Christian focused on the technology of data collection in the field – paper has both upsides and downsides, and technology addresses a lot of the issues with paper. Open Data Kit (ODK) provides the ability to collect data in the field, and it is an open source project. With ODK you build forms, then collect data, then manage and analyse it with Google Maps Engine and Google Earth Engine. ODK has a tool to build forms, and it also has a sensors framework. ODK Aggregate allows sharing data in a spreadsheet or in Fusion Tables, which can then be visualised on maps.
The final talk in the session was Public Lab: Open and cooperative structures for community-based environmental health monitoring by Shannon Dosemagen. Shannon covered the work of the Public Laboratory for Open Technology and Science, looking at the process that Public Lab established for working with communities. She described the new tools that were emerging at the time of the BP oil spill in 2010. An analysis of the barriers to community-based environmental science and health monitoring showed that the tools in use are expensive and aimed at expert users who are able to interpret the results. Public Lab tries to engage people in the full data life cycle and to provide everything that is needed, from the development of the tools to the use of the results. At the heart of the activities are the social interactions. The combination is: low-cost hardware + collaborative web software + visual data that can help people understand it + a public archive so everything is accessible + you. They created an open space that allows people to share experiences. She demonstrated the web tools – first, collaborative writing efforts, and individual research notes that are tagged to bigger bits of information. They encourage people and recognise the contributions that people make. They maintain many email lists that are localised and place-based. They have 65 organisers who have integrated the tools of Public Lab in their areas, and ‘places’ pages to highlight the local connections. They treat participants as researchers, and they also build openness into the process – even the physical link between the balloon and the operator allows for social interactions. The barn raising is also a valuable event in the Public Lab calendar; it is about people coming together. Another way to add value is mainstreaming true accessibility by linking imagery to Google Earth. They also protect openness with viral licensing, so share-alike licences are central.
They also allow local versions of the hardware and tools. A third of the organisers are associated with higher education institutions. Calibration is another issue, and Public Lab is working with research institutions to test the tools.
The final symposium of the day was 4A DIY Aerial Photography: Civic Science and Small Data for Public Participation and Action, chaired by Shannon Dosemagen, with cases bringing stories of engagement and change ranging from the Los Angeles River (Lila Higgins) and the Gulf of Mexico (Scott Eustis), to Uganda (Maria del C Lamadrid) and Palestine-Israel (Hagit Keysar).
The symposium covered the technical aspects of DIY aerial photography. Public Lab aims to create DIY, low-cost (below $150) tools that can be used by different communities. The basic components are provided as kits that can be built easily, with tutorials, guides and hand-drawn instructions. People can take any design and change it, but are asked to share it back and continue to develop it. People use aerial mapping across the world – MapMill is an image sorting site based on good/not-good judgements of each image, and MapKnitter allows stitching the images into a map. There is attribution to the people who collected, classified and stitched the map. Finally, they use print publications to share the data with other people.
Scott Eustis described his work, which is aimed at understanding where people want to manage their wetlands – it provides a way to involve the people who know the land deeply. The effort is about a 3-hour field trip and 2 hours to turn the images into a useful map. Going out after rain events to record the location of water helps to communicate to the authorities which places are flooded and causing problems. He highlighted the ability to ‘eyeball’ statistics, though in most cases only a little information is needed beyond the image itself. With near infra-red imagery there is also the ability to see information about plant growth and health.
Lila Higgins described her additional interest in the Los Angeles River – a lot of people don’t know that there is a river there, and the balloon mapping was aimed at increasing recognition. The river became invisible: in the early 1900s, during storm events, it caused houses to collapse with major property loss. To deal with that, a concrete channel was created from 1938, and the river was fully concretised by the 1960s. Most people see the concrete view, but there are people who saw a different option for the river. Activists navigated the river’s 51 miles with kayaks, and the EPA then declared it protected under the Clean Water Act. The aim is to make it a shared space. They started by using Google Earth to pick a section of the river, flew a balloon over it, and used the mapping to build a community. They are documenting events that promote the use of the river.
Maria del Carmen Lamadrid looked at mapping to stall an eviction in Uganda. The balloon mapping was aimed at assisting people who faced a threat of eviction in February 2013; some people had been using the area for over 20 years. The market was mapped in November 2012, and when the police came to evict the residents, the evidence was used to prove that the place was being used in a valuable way. The project asked questions about self-representation and about tools that can allow the people in the area to control the data collection. Aerial mapping was combined with stills from the ground to tell stories about the area. Land issues are complex in Uganda, with tension between different tenure structures, and because official data and tools such as Google Maps didn’t show the level of use of the market, the balloon mapping helped demonstrate the residents’ rights. In the end, the eviction went ahead, and they created a map that shows it. Even so, the project helped in terms of empowerment and gaining control over the process, and the residents were treated as equals during it.
Hagit Keysar looked at two use cases in East Jerusalem, as an activist and researcher, exploring the recording and documentation of human rights abuses – that is, accountability by the communities themselves. Some 60% of Jerusalem’s population lives in East Jerusalem, and of them, 40% are Jewish and 60% Palestinian. There are regular surveillance balloons flown by the authorities in the area – so what’s the role of DIY aerial photography in this context? The Silwan village outside the Old City is contested, and a map was created with information activists in the neighbourhood – they wanted to free themselves from dependency on human rights organisations or the UN, which don’t provide suitable information, and they annotated the imagery with personal stories. The detail of the imagery – in effect, having a satellite above their own neighbourhood – provides information that Palestinians cannot access due to local restrictions. The second case is in Beit Safafa and concerns the impact of a 6-lane motorway that cuts through the neighbourhood – a discriminatory urban planning practice. A community activist used the aerial photograph to explain the issues when presenting the information in different places. The photograph is not a map – it’s a testimony, something real that makes a difference. The maps that were available did not make the damage to the community legible, so the photograph provides a testimony to the damage – a person who looks at the image can understand what the abuses are.
The discussion that followed the presentations also highlighted the integration of objective information from the image with the narratives and stories of the community. It is also important to explore the journey to empowerment – a joint journey, by the participants and the people who bring the technology, to understand how the technology works. The use of this toolkit makes people interested and creates a shared imagination between the person who promotes it and those who get involved. Flying above your local environment is powerful. There is also the potential of documenting important temporal moments (e.g. many birds, or flooding).
The final session of the day included the poster session, and provided me a unique opportunity to finally meet Louis Liebenberg and to hear about CyberTracker from the person who has led it since 1996.
You can also see another description of the first day at http://microbe.net/2015/02/11/day-1-report-from-citizen-science-2015-conference/
7 February, 2015
The Open University, with support from the Nominet Trust and UTC Sheffield, has launched the nQuire-it.org website, which seems to have great potential for running citizen science activities. The nQuire platform allows participants to create science inquiry ‘missions’. It is accompanied by an Android app called Sense-it that exposes all the sensors that are integrated into a smartphone and lets you see what they are doing and the values that they are showing.
The process of setting up a project on the nQuire-it site is fairly quick, and you can figure it out in a few clicks. Joining the project that you’ve created on the phone is also fairly simple, and the integration with Google, Facebook and Twitter accounts means that linking profiles is quick. You can then get a few friends to start using it, and the Sense-it app lets you collect the data and share it with other participants in the project on the nQuire website. Participants can comment on the data, ask questions about how it was produced, and vote it up or down. All this makes nQuire a very suitable place for experimentation with the sensors in smartphones and for prototyping citizen science activities. It also provides an option for recording geographic location, and it’s good to see that this is disabled by default, so the project designer needs to actively switch it on.
28 January, 2015
What do addresses have to do with economic theory and political dogma? It turns out, quite a lot. As I was looking at the latest press release from the Cabinet Office, proudly announcing that the government is investing in (yet another) UK address database, I realised that the handling of UK addresses – those deceptively simple ‘221b Baker St NW1 6XE‘ strings – provides a parable for the stupidity of neoliberalism.
To avoid doubt: this is not about Open Addresses UK. It’s about the systemic failures of the past 20 years.
Also for the avoidance of doubt, my views are similar to Richard Murphy’s about the joy of tax. I see collective action and common investment in national assets through taxation as a wonderful thing, and I don’t mind R&D investment being spent on infrastructure that might fail – it’s true for Beagle 2 as much as it’s true for a national address database. So you won’t see here ‘this is a waste of taxpayers’ money’. It’s the systemic issues that I question here.
Finally, if I got some specific details of the history of the development wrong – I’m happy to stand corrected!
The starting point must be to understand what the point of an address database is. The best explanation comes from one of the top UK experts on this issue – Bob Barr (OBE). Bob identified ‘Core Reference Geographies’, which have the following characteristics: they are definitive; should be collected and maintained once and used many times; are natural monopolies; have variable value in different applications; and have highly elastic demand. We can also call these things ‘Commons’, because of the way we want people to be able to share them while protecting their future – ideally avoiding a ‘tragedy of the commons’.
Addresses are such a ‘core reference geography’. Think about all the applications for a single, definitive database of all UK addresses – it can be used to send the post, plan the census, dispatch emergency services, deliver a broadband link to the right property, check for fraud during purchase transactions, and much more. To make sense of the address above, you need the geographical location, the street name and house number, and the postcode. An Ordnance Survey map can be used to set the location, the street name is set by the local authority, and the postcode by the Royal Mail. Merge these sources with a few other bits of information and, in principle, you have a definitive set. Do it for the whole country and you have this ‘core reference geography’, which sounds simple…
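To make the merging step concrete, here is a minimal sketch in Python. All the field names and lookup structures are hypothetical, chosen only for illustration – the real datasets (Address-Point, local authority gazetteers, the PAF) have their own schemas and property identifiers:

```python
from dataclasses import dataclass

# Hypothetical schema for illustration only -- the real products
# use their own field names and identifiers.
@dataclass
class AddressRecord:
    house_number: str   # assigned by the local authority
    street: str         # named by the local authority
    postcode: str       # assigned by the Royal Mail
    easting: float      # location, e.g. from Ordnance Survey mapping
    northing: float

def merge_sources(locations, street_gazetteer, postcode_file, prop_id):
    """Combine the three sources for one property into a single,
    definitive record, keyed on a shared property identifier."""
    return AddressRecord(
        house_number=street_gazetteer[prop_id]["number"],
        street=street_gazetteer[prop_id]["street"],
        postcode=postcode_file[prop_id],
        easting=locations[prop_id][0],
        northing=locations[prop_id][1],
    )
```

The hard part, of course, is not the merge itself but getting all three holders of the data to agree on a shared identifier and to share the data at all – which is exactly where the story below breaks down.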
The story is a bit more complex – as long as information was not digitised and linked, mismatches between addresses from different sources were not a huge problem, but in the mid 1990s, because of the use of digital records and databases, it became important to have a common way to link them. By that time, the Post Office Postal Address File (PAF) had become the de facto definitive address database. It had actually been around since the 1970s, used by the Post Office not as a definitive address database but to serve the internal needs of mail delivery. However, in the absence of any other source, people started using it – for example, in statistical studies (e.g. this paper from 1988). While I can’t find a specific source for the history of the PAF, I guess that at some point it became a product that was shared with other organisations and sold to direct marketing companies and other users. Naturally, it isn’t what you would design as the definitive source if you were starting all over again, but it was there, and it was good enough, so people used it.
Without raising false nostalgia about the alternatives, imagine that the need for a definitive address database had arisen at a time when all the entities responsible for the elements of an address were part of the public sector. There would have been plenty of power struggles, feet dragging, probably cross-departmental animosity and all sorts of other obstacles. However, as has been proven time and again – when it is all inside the sphere of government control, reorganisation is possible. So you could imagine that, at the end of the day, you’d get an ‘address directorate’ that manages addresses as a national commons.
Now we can get to the core of the story. Let’s look at the definition of neoliberalism that I want to use here. It comes from a very good article on the Daily Kos: ‘Neoliberalism is a free market economic philosophy that favors the deregulation of markets and industries, the diminution of taxes and tariffs, and the privatization of government functions, passing them over to private business.’ In terms of the political dogma that came with it, it means seeing market solutions as the only solution to societal issues. In the UK, this form of thinking started in the 1980s.
By the time GIS proliferated and the need for a definitive address database became clear, the neoliberal approach was in full gear. The different entities that needed to share information in order to create this common address database had been pushed out of government and were asked to act in a quasi-commercial way, at which point the people who run them were instructed to maximise the self-interest of the entity and market their products at prices that ‘the market will bear’. However, with no alternatives and the necessity of using definitive information, pricing is tricky. In terms of sharing information and creating a common product, such entities started bickering over payments, intellectual property and control. The Ordnance Survey had Address-Point, the Post Office/Royal Mail had the PAF, and while both remained de facto datasets, no satisfactory definitive database emerged. You couldn’t get beyond this point, as the organisational structure required each organisation to hold on to its ‘property’, so while the need became clearer, the solution was now more difficult.
In the second round, what looked like a good bottom-up approach was proposed. The idea was that local authorities are the best source of information for creating a definitive address database (the National Land and Property Gazetteer), because they are the closest to the changes on the ground and can manage them. However, we are under neoliberal dogma, so the whole thing needed to operate commercially, and you go for a public/private partnership for that. Guess what? It didn’t work.
Third round: you merge the company from the second round with an entity from the first round to create another commercial partnership. And you are still stuck, because fundamentally there is still the demand to control assets in order to sell them in the market.
The fourth step – one that deserves to be called the most idiotic in the story – is the privatisation of the Royal Mail, which needed to maintain ‘assets’ in order to be ‘attractive to investors’, so you sell the PAF with it. It all works within neoliberal logic, but the implication is that instead of dealing with a network of publicly owned bodies which can be told what to do, you now have the PAF in the private sector, where intellectual property is sacred.
In the final stage, you think: oh, I’ve got a solution – let’s create a new entity that will crowdsource/reuse open data. However, being a good neoliberal, you ask it to come up with a business model. This time it will surely work, ignoring the huge effort of building business models and everything that has been invested in trying to pay for a sustainable address database over the past 20 years. This time it’s going to work.
Let’s ask, then: if we believe in markets so much, shouldn’t we expect to have seen a competitor to the PAF/Address-Point/NLPG address databases appear by now? Here we can argue that it’s an example of ‘market failure’ – the most obvious kind is when you see a lack of investment or interest from ‘participants in the market’ to even start trading.
If it were indeed all about free markets and private entrepreneurial spirit, you might expect to see several database providers competing with one another until, eventually, one or two become dominant (the ‘natural monopoly’ above) and everyone uses their services. Building such a database in the era of crowdsourcing should be possible. Just as in the early days of OpenStreetMap, you don’t want ‘contamination’ by copying information from a source that holds database rights or copyright over the information you use. So we want cases of people voluntarily typing in their addresses, while the provider collates the raw data. Inherently, in the same way that Google crowdsources queries because people type them in and give the text to Google to use, so does anyone who types their delivery address into Amazon.co.uk. These are crowdsourced addresses – not copied from an external dataset, so even if, for the purpose of error checking, an entry is tested against the PAF, they are not derivatives. Take all these addresses, clean and organise them, and you should have a PAF competitor that was created by your clients.
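As a rough illustration of the ‘clean and organise’ step, a retailer collecting typed addresses could collapse spelling and punctuation variants into a canonical key and keep entries that several independent customers agree on. This is only a sketch of the idea – real address matching is far harder, dealing with abbreviations, flats and typos:

```python
import re
from collections import Counter

def normalise(addr: str) -> str:
    """Collapse case, punctuation and spacing so that variants of the
    same crowdsourced address ('221b Baker St.' vs '221B BAKER ST')
    reduce to one canonical key."""
    addr = addr.upper()
    addr = re.sub(r"[^\w\s]", "", addr)        # drop punctuation
    addr = re.sub(r"\s+", " ", addr).strip()   # collapse whitespace
    return addr

def collate(entries, min_count=2):
    """Count how many entries map to each canonical address; addresses
    typed independently several times are more likely to be genuine."""
    counts = Counter(normalise(e) for e in entries)
    return {a: n for a, n in counts.items() if n >= min_count}
```

For example, `collate(["221b Baker St.", "221B BAKER ST", "10 Downing Street"])` keeps only the Baker Street entry, typed twice in different forms.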
So Amazon is already an obvious candidate for creating such a database from ‘passive crowdsourcing’ as a side effect of its day-to-day operations. Who else might have a database, built from people inputting addresses in the UK, from which a fairly good address database could be created? It doesn’t take a lot of thinking to realise that there are plenty. Companies operating at a scale like Amazon’s probably hold a very high percentage of UK addresses. I’d guess that Experian will have them for its credit checks, and Landmark is in a very good place because of all the property searches. You can surely come up with many more. None of these companies is offering a competitor to the PAF, and that tells you that, commercially, no private sector company is willing to take the risk and innovate with a product. That’s understandable, as there is the litigation risk from all the messy group of quasi-public and private bodies that see addresses as their intellectual property. The end result: there is no private sector provision of an address database.
And all the while, nobody dares to think about nationalising the database – forcing, by regulation and law, all these quasi-commercial bodies to work together regardless of their ways of thinking. And it’s not that nationalisation is impossible – just look at how miraculously Circle Healthcare is ‘exiting its private contract’ (because the word nationalisation is prohibited in neoliberal dogma).
To avoid trolling from open data advocates: I wish the best to Open Addresses UK. I think it has a super tough task, and it will be great to see how it evolves. If, as with OSM, one of the companies that can crowdsource addresses gives it their dirty data, it could build a database fast. This post is not a criticism of Open Addresses UK, but of all the neoliberal dogmatists who can’t simply go for the most obvious solution: take the PAF out of Royal Mail and give it to Open Addresses. Considering the underselling of the shares, there is an absolute financial justification for doing so, but that’s why I pointed out the sanctity of private companies’ assets…
So the end result: huge investment by government, failing again and again (and again) because it insists on neoliberal solutions instead of the obvious treatment of commons – have government hold them and fund them properly.
16 January, 2015
Thanks to invitations from UNIGIS and Edinburgh Earth Observatory / AGI Scotland, I had an opportunity to reflect on how Geographic Information Science (GIScience) can contribute to citizen science, and what citizen science can contribute to GIScience.
Despite the fact that it’s been 8 years since the term Volunteered Geographic Information (VGI) was coined, I didn’t assume that the whole audience was aware of how it came about or of the range of sources of VGI. I also didn’t assume knowledge of citizen science, which is a far less familiar term for a GIScience audience. Therefore, before going into a discussion about the relationship between the two areas, I opened with a short introduction to both, starting with VGI and then moving to citizen science. After introducing the two areas, I suggested the relationships between them – there are types of citizen science that overlap with VGI, such as biological recording and environmental observations, as well as community (or civic) science, while other types, such as volunteer thinking, include many projects that are non-geographical (think EyeWire or Galaxy Zoo).
However, I don’t just list a catalogue of VGI and citizen science activities. Personally, I find trends a useful way to make sense of what happened. I learned that from the writing of Thomas Friedman, who used them in several of his books to help the reader understand where the changes he covers came from. Trends are, of course, speculative, as it is very difficult to demonstrate causality or to be certain about the contribution of each trend to the end result. With these caveats in mind, there are several technological and societal trends that I used in the talk to explain where VGI (and the VGI element of citizen science) came from.
Of all these trends, I keep coming back to one technical and one societal trend that I see as critical. The removal of the selective availability of GPS in May 2000 is my top technical change, as the cascading effect from it led to the deluge of good-enough location data that is behind VGI and citizen science. On the societal side, it is the Flynn effect, as a signifier of the educational shift of the past 50 years, that explains how the ability to participate in scientific projects has increased.
In terms of the reciprocal contributions between the fields, I suggest the following:
GIScience can support citizen science by contributing the data quality assurance methods that are emerging in VGI; there are also plenty of spatial analysis methods that take heterogeneity into account and are therefore useful for citizen science data. The areas of geovisualisation and human-computer interaction studies in GIS can assist in developing more effective and useful applications for citizen scientists and for the people who use their data. There is also plenty to do in considering semantics, ontologies, interoperability and standards. Finally, since critical GIScientists have long been looking into the societal aspects of geographical technologies – such as privacy, trust, inclusiveness, and empowerment – they have plenty to contribute to citizen science activities in terms of how to carry them out in more participatory ways.
On the other hand, citizen science can contribute to GIScience, and especially to VGI research, in several ways. First, citizen science can demonstrate the longevity of VGI data sources, with some projects going back hundreds of years. It provides challenging datasets in terms of their complexity, ontology, heterogeneity and size. It can raise questions about scale and how to deal with large, medium and local activities while merging them into a coherent dataset. It also provides opportunities for GIScientists to contribute to critical societal issues such as climate change adaptation or biodiversity loss. It provides some of the most interesting usability challenges, such as tools for non-literate users, and, finally, plenty of opportunities for interdisciplinary collaborations.
The slides from the talk are available below.