Citizen Science 2019: Environmental Justice and Community Science: A Social Movement for Empowerment, Compliance, and Action

The session was opened by Na’Taki Osborne-Jelks, Agnes Scott College (CSA board): the environmental justice movement has used methods of community science that we need to include in the tent of citizen science. Sixty participants in the conference are supported by the NSF to take part, and there was a special effort to ensure that Environmental Justice is represented in the conference.

Ellen McCallie (NSF), whose agency provided a grant to support EJ activists to join the conference, noted that NSF INCLUDES has a specific focus on those who are under-represented in STEM and underserved by NSF projects. There are about 150 NSF projects that include citizen science and crowdsourcing, and all of them push the boundaries of knowledge or help people learn about science.

The panel was moderated by Sacoby M. Wilson, Community Engagement, Environmental Justice, and Health (CEEJH), University of Maryland-College Park. The chair set three questions:

First question: how did you get into citizen science/community science?

Second question: what were some successes?

Third question: what is your message to the CSA?

Panellists:

Viola “Vi” Waghiyi, Alaska Community Action on Toxics (ACAT), St. Lawrence Island, Alaska.

Located in the northern Bering Sea, she has 4 boys and her community. They are close to Siberia, and the Air Force established two bases there during the Cold War. The people in the area continue to live on the land, and they want to keep their way of life and not separate themselves from the land and sea. It’s an island the size of Puerto Rico, but TB, starvation, and other impacts reduced the population to 1,500 people. The bases were established at each end of the island and stayed there from 1940 to 1970, and the contamination contributed to cancers and health defects. Their concerns about the impacts were ignored, and they pleaded for help. An executive director who was a scientist helped them start community-based participatory research, and they now know that they have higher PCB levels and are one of the most contaminated communities because they rely on traditional food – chemical releases end up in their environment even though there are no chemical factories there. They have a crisis in their community. She took a position to learn about chemicals and their impact on her people and has been doing it for 17 years – taking samples, doing research, and training local people.

Success – there are institutional barriers: a small non-profit faces challenges in addressing the PCBs, and the state is pro-development of energy sources, so the state agencies don’t look after marginalised communities. There are also issues of funding, with funding refused because their expertise is not seen as valuable. The success – raising awareness that there are so many chemicals being created, all of which impact you and your body, produced by companies that don’t take human health into account. The indigenous group is part of the human rights convention process and trained to use their voice to influence it – work at the international level helps everyone.

Traditional knowledge, songs, dances, and creation stories matter, and we also need sound data that scientists can use to help communities with health and disparities.

Margaret L Gordon, West Oakland Environmental Indicators Project, Oakland, CA – dealing with a dirty diesel project. She has connected her community to efforts to improve the air near the port for over 25 years.

How did she get into the field? She got involved in citizen science because she got tired of the state and local agencies and their lack of response. They organised to demonstrate to the city, the county, and the EPA that they can collect information and measure their own air quality. They started in 2008 in Oakland and Berkeley, and researchers came to them. They started to use dust measurements, trained a community measurement technician, and really understood how to use the equipment and keep it accurate.

Success – being part of creating equitable solutions and problem-solving mechanisms to address the issues. It takes an understanding of problem-solving and bringing in people from the city, but it needs an equitable process; she also served on the board of the Port of Oakland, and that was useful for addressing issues. Some people in citizen science who didn’t learn how to engage communities should not come to communities – they had to teach researchers how to work with them, and there are also issues with universities that want to collaborate but don’t share funding with community organisations. Relationships of trust and good communication can work.

We need cumulative impact assessments carried out in impacted communities, and there is not enough academic research in the communities that are exposed to pollution. Better impact science.

Question about climate change: we need to talk about climate justice, and to discuss the capacity of poor communities to deal with floods and other impacts.

Omega R. Wilson, West End Revitalization Association (WERA), Mebane, NC – doing Community Owned and Managed Research – the gold standard for community science.

The EJ movement and its activity started 70 years ago (he is 69), before they were born – it was passed on from their mothers. Issues of toxic-free soil, good water, good air – there is a continuum. He moved after university to Mississippi and then to NC, developed a new understanding of EJ issues, and with the support of NIH helped to develop research in that area of North Carolina.

Successes – community groups deserve recognition in books and publications. There was intimidation of activists’ family members by state officials. The use of the law is a way to get things done and to achieve results.

The Citizen Science Association should be about dealing with problems, not just studying them. Push universities to actually fund work on pollutant use – the CSA should encourage growing science education for Hispanic, Black, and Indigenous groups. The association needs to support communities in getting the resources. Science for people, as science for action.

There are also issues of terminology – moving from ‘citizen science’ to ‘community-based science’ and ‘community-based research’: everyone has a right to clean water.

Vincent Martin, Community Organizer Against Petroleum Refineries, Detroit, MI – pushes issues of air quality around Detroit and is active at the national level. He got his company to assist the community with EJ issues.

There is a basic right to air and water, and climate change will make things worse in poor communities. His community got coal, roads, and highways, and a lot of hazardous material is released into their community. When they started putting “white crosses” on a map for each person who died from an environmentally related disease, it became unbearable and they had to stop. He experienced that with a brother who died from that impact. There was a proposal to bring tar sands to their area for processing, and they pointed out that the zoning was incorrect; that was ignored – but when the authorities checked, it showed that the community was correct, yet the city authorities approved the expansion anyway. He started to learn about toxics, about the issues, and about how communities are treated in such situations. The community needs to provide oversight, say “hey, we don’t want that”, and get some transparency and equity.

Beverly Wright, Deep South Center for Environmental Justice and Dillard University, New Orleans, LA – has influenced national policy and EJ issues nationally.

She got her PhD in Buffalo as a sociologist working with the trauma of Love Canal and its impact on the community. In the Mississippi Industrial Corridor they could see the chemical impact on the community, and while people could see the evidence on fish and insects, because there was only one chemical at a time they couldn’t show a link. In a community that she worked with, they took their own samples. She fell into citizen science through “we don’t trust you”, recruited toxicologists, and set out environmental sociology to work with a community. They created the first GIS map showing the spatial distribution of proximity to TRI facilities by race and income, and there were clusters of black communities.
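
[A rough illustration of the kind of proximity analysis described above: the sketch below buffers facility locations and compares tract demographics inside and outside the buffer. The file names and column names are invented for the example – this is not the original study’s data or method.]

```python
# Minimal sketch: compare tract demographics near and away from TRI facilities.
# File names and column names ('pct_black', 'median_income') are hypothetical.
import geopandas as gpd

facilities = gpd.read_file("tri_facilities.geojson").to_crs(epsg=3857)
tracts = gpd.read_file("tracts.geojson").to_crs(epsg=3857)

# 1 km buffer around each facility, merged into a single exposure zone
exposure_zone = facilities.buffer(1000).unary_union

# Flag tracts that touch the exposure zone
tracts["near_tri"] = tracts.geometry.intersects(exposure_zone)

# Average demographics of tracts near facilities vs. the rest
print(tracts.groupby("near_tri")[["pct_black", "median_income"]].mean())
```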

Success – she is one of the only PhDs who doesn’t get kicked out of community meetings. They created a community–university model in 1992, and the EPA used that model for its Community–University Partnership programme. In Louisiana there were issues of working with communities – the typical environmental organisations (white, middle class – Big Green) bring students from the outside who then go away and don’t leave anything behind. So she brought researchers to teach communities how to use the processes and collect data – and that was the creation of the Bucket Brigade. The white crosses were used to demonstrate unusual cancer rates in the chemical corridor. It took 18 years to win a case, but through the Bucket Brigade’s effort, the white steam that goes through the community was captured and sent to the EPA. Once it was captured, the EPA changed its approach and organised the community of the Diamond Plantation, which got funding for relocation.

The level of pollution that is allowed by the EPA – permits are set by the first company that was allowed to pollute, and the licences are, in effect, about poisoning people.

Science is not leading to action – most of the time. We need political science: science and advocacy.

There is internalised racism, and it is real: black people are working for everybody, and there is an issue of others speaking for them. Black people are the only group that was enslaved by this country, and that persists even in EJ; other ethnic groups – e.g. Latinos, Native Americans – are not supporting the black group. It is an issue of racism that has carried over to other minority groups. But black people have learned to stand up for themselves.

Climate change: the EJ movement is pushing for the Green New Deal to include justice and equity elements, and not to allow carbon trading that will leave pollution in poor communities. We need to think about how to have a just transition to a green economy. That is an effort towards the 2020 election.

Carmen Velez Vega, PhD, MSW, Tenured Full Professor, University of Puerto Rico – Medical Sciences Campus – addressing public health issues, and involved in the recovery of Puerto Rico after Hurricane Maria.

She became involved in EJ because before that she was an activist in LGBT issues, e.g. same-sex adoption, and that experience opened up other experiences. Puerto Rico is an environmental injustice island – one phenomenon is the same people fighting on everything. As a social worker she started to learn, including in the school of public health. She was involved in a project that was funded by the NIH and that looked for someone to do community engagement with a known researcher, using the texts of Phil Brown, and through that she was exposed to the risks that women of reproductive age face. There is an issue of contaminated water and toxic products. She learned that not all women are exposed equally – the poorer and browner you are, the more exposed you are. After Hurricane Maria, they were abandoned by the authorities, and that added to the injustice. The injustices would not disappear.

The CSA should promote policies that push towards environmental justice and have impact at a larger scale, promote young people and leaders in the area of environmental justice, and work with the communities.

 

 

Esri User Conference – Science Symposium

 

Esri Science Symposium

As part of the Esri User Conference, Dawn Wright, Esri Chief Scientist, organised a Science Symposium that gave an opportunity for those with interest in scientific use of Esri GIS to come together, discuss and meet.

Dawn Wright opened and mentioned that the science symposium aims to bring people from different areas – hydrology, ecology, or the social sciences – together. The Esri science programme is evolving, and there is an official science communication approach. There are different ways to support science, including a sabbatical programme. Esri will launch a specific challenge for students on applications of data sets, with a focus on land, ocean, and population. Esri will provide access to all the data that is available, and the students are expected to carry out compelling analysis and communicate it. It is an activity in parallel to the global year of understanding. There are also sessions at the AGU meeting that are supported by Esri staff.

Margaret Leinen (president, American Geophysical Union), who works on marine science and oceanography, gave the main talk ‘what will be necessary to understand and protect the planet…and us?‘. Her talk was aimed at the audience in the conference – people whose life focus is on data. What is necessary to understand the planet is data and information – it’s the first step of understanding. There are many issues in protecting and understanding the planet – we need to understand planetary impacts on us.

The first example is the way we changed our understanding of climate change in the ocean. When we look at the change in sea surface temperature in the 1990s we can see changes of up to 2 degrees F. The data was mostly collected by traditional means – measurements along the paths of ships. Through studies from ship records over the years, we have created a view of ocean heating – with different results between groups and researchers, and lots of hand-crafted compilation of records. In the last decade things have changed: Argo floats go up and down through the ocean and make all the data available – there are 3,839 operational floats, reporting every week. This is a completely new way of seeing the data, with huge-scale international collaboration. Now we can see the annual cycle and determine the slope of the change in heat content. We have a 10-year time series for the depth range of 0–2000 m, and much more detailed information about the changes. There is an effort to build devices that will capture the full planetary heat budget through the whole depth of the ocean.

The EarthScope facilities also provide a demonstration of detailed sensing data – understanding the Earth and its movements. Many seismometers have been used for over a decade – the USArray provided a massive increase in the resolution of seismic measurements. In 2011, the network identified the Japanese Honshu earthquake. The measurements enabled a new class of earthquake modelling that can be used in engineering and science. GPS also provides new abilities to understand deformation of the Earth. Permanent GPS receivers – many of them – can provide the resolution and accuracy to notice subtle movement, by using very sophisticated statistical filtering.

HPWREN – the High Performance Wireless Research and Education Network – provides a way to transfer information from sensors that are very remote, linked through line-of-sight communication, and the network provides a reliable and resilient public safety network. The network supports many sensing options. There are fire cameras linked to it that provide real-time alerts to the fire department. WiFire is a programme that aims to work deliberately on these issues. GIS data is used to assess surface fuel. In summary: Earth science is going through a huge transformation through the collaboration of large groups of researchers who are using dense sensing networks. We can now monitor different processes – from short to long term. We gain new insights, and they are rapidly transformed into local, regional, national, and global responses.
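
[As an aside, the ‘slope of the change in heat content’ is essentially a linear trend fitted to the heat-content time series. A minimal sketch, with invented numbers standing in for an annual 0–2000 m series:]

```python
# Minimal sketch: fit a linear trend to an annual ocean heat content series.
# The values below are invented placeholders, not Argo measurements.
import numpy as np

years = np.arange(2006, 2016)                # a 10-year series
ohc = np.array([2.1, 3.0, 3.4, 4.2, 5.1,
                5.8, 6.5, 7.7, 8.4, 9.3])    # heat content anomaly (e.g. 10^22 J)

slope, intercept = np.polyfit(years, ohc, 1)  # least-squares trend
print(f"Estimated warming trend: {slope:.2f} x 10^22 J per year")
```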

After her talk, a set of responses was organised from a panel including Mike Goodchild, John Wilson, Marco Paniho, Ming Tsou, and Cyrus Shahabi.

John: a discussion about GIScience – the examples that we’ve seen point to future challenges. We can train people in the spatial sciences and insist that they learn another area, or change the earth sciences so people learn about spatial issues, or go somewhere in between, with people becoming aware of each other’s language. Spatial scientists have little capacity to learn a new area – and the same is true for earth scientists. The only viable path is to work together – it’s about working in interdisciplinary teams and enabling people to work with them. Data acquisition is moving fast, and it is a challenge to train graduates in this area. Only recently have we started thinking about solutions. Academics are experts in dealing with problems in the world; instead we need to suggest solutions and test them.

Marco: the principles and ideas are problems that are familiar in GIScience, although the specific domain of the problem was not familiar. Issues of resolution and scale are familiar in GIScience. We have a long way to go in terms of the detail of describing a phenomenon. How systematic are we now in acquiring data? We need details of the maps of the heating of the ocean, and to understand what is going on. What is the role of remote sensing in helping us monitor global phenomena? We need to think about down-scaling – getting from aggregate data to a more detailed understanding of something locally. What is the role of citizens in providing highly local information on phenomena?

Ming: we need to remember ‘How to Lie with Maps’ – we need to be very careful about visualisations and cartographic visualisation. Each map uses a projection and a cartographic representation, and we need to ask if that is the appropriate way to present the information. How can we deliver meaningful animation? Cartography is changing fast – today we may need to look at the 1:2,000–1:5,000 scale, but we now use zoom levels rather than scale. The networks and models of wildfire raise questions about which model is appropriate, how many variables we need and which sources of information, as well as the speed of the modelling. We need to think about which model is appropriate to use.

Cyrus: there are more and more sensors in different contexts, and with machine learning we have an increased ability to monitor cities. Where models already exist, there are cases of applying more data analysis from computer science.

Margaret: we have a new ability to move from data to model to analysis and keep the cycle going. In the past there was a gulf between modelling and observations; we don’t see a divide any more, and we see people moving between the modelling and the data.

Discussion points: we need to consider what messages we want to communicate in our maps – we need to embrace other disciplines in improving communication. We need to implement solutions – and decide how much uncertainty we are willing to accept. Every single map or set of data is open and other people can look at it and change it – this is a profound change.

The Earth system is an interrelated system, but we tend to look at specific variables, and data comes in different resolutions and levels of detail, which makes it difficult to integrate. Spatial statistics is the way to carry out such integration; the question is how we achieve that.
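
[One everyday version of this integration problem is bringing two rasters onto a common grid before combining them. A minimal sketch, assuming two hypothetical GeoTIFFs that already share a coordinate system:]

```python
# Minimal sketch: resample a coarse raster onto the grid of a finer one so the
# two can be combined cell by cell. File names are hypothetical placeholders.
import numpy as np
import rasterio
from rasterio.enums import Resampling
from rasterio.warp import reproject

with rasterio.open("fine.tif") as fine, rasterio.open("coarse.tif") as coarse:
    resampled = np.empty((fine.height, fine.width), dtype=np.float32)
    reproject(
        source=rasterio.band(coarse, 1),
        destination=resampled,
        dst_transform=fine.transform,
        dst_crs=fine.crs,
        resampling=Resampling.bilinear,   # smooth interpolation for continuous data
    )
    combined = fine.read(1) + resampled   # both arrays now share one grid
```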

It’s not enough to have data that is open; the issue is how to allow people to use it – issues of metadata and making it able to talk with other data sets. Esri provides mechanisms to share and access the data.

There is an uncomfortable relationship between science and policy – the better the models, the more complex the issue of discussing them with the public. How do we translate decimal points into adjectives for policy making? This creates a communication challenge with the public and policy makers. There is a need to educate scientists to be able to communicate with the wider public.

Another issue is interdisciplinarity – encouraged at the graduate level, but not when it comes to the profession. There are different paths. Once you land a job, it is up to how you behave and perform.

Considering the pathways of integration, there is a challenge between the modellers and the observationalists. We can think about identifying a path between them.

Machine learning might require us to re-evaluate how we learn and know something. There is also a need to think about which statistics we want to use.

Margaret: what is different now is a growing sense of not being able to characterise the things that are going on – an understanding of our ignorance. In the past we had simple linear expectations of understanding. We are finding that we don’t understand the biosphere and what it does to the world. There are so many viruses in the sea air, and we don’t know what they do to the world. The big revolution is the insight into the complexity of the Earth system. How do we avoid simplifying beyond the point where we lose important insight?

Esri User Conference 2016 – plenary day

The main Esri User conference starts with a plenary day, where all the participants (16,000 of them) join together for a set of presentations from 8:30 to 3:30 (with some breaks, of course). Below you’ll find some notes that I took during the day:

The theme of the keynote was GIS – Enabling a Smarter World. After an inspirational video (emphasising environmental applications of GIS, including dealing with sustainability and biodiversity), Jack Dangermond opened the conference by covering a range of applications that fall under smart GIS. Examples include environmental monitoring and energy management for renewable energy and grids; the management of land information and urban design (green infrastructure plans, corridors for wildlife, etc.); transport, where smart routing reduces environmental impacts and increases efficiency; engineering and public works; utilities and telecommunications; business analytics (an area that is finally taking off); public safety; and humanitarian support. We have an increasing understanding of citizen engagement through open data, and the UN is using GIS to share open data in data management for the Sustainable Development Goals. Storytelling and story maps are becoming central to the way information is shared.
We’re living in a world that is undergoing a massive digital transformation – how do we go forward on this wired planet? GIS is a language for understanding the world. We need to address the crisis of sustainability – we need to address the problems together. GIS allows integration and visualisation – a framework to design for the future through geodesign. Turn information into action – from measuring the world to affecting it. GIS itself is getting smarter – through technologies and tools, sensors, and types of data. Smart GIS is a variety of things: the ability to connect to real-time information – IoT, remote sensing – and connecting everyone, assisting communities to understand what they are doing and to act. It means integrating spatial data and records with systems of engagement. This is possible through the Web GIS pattern. Earthquake alerting from the USGS tells people to get ready, and there are also flood analytics. There is an emerging ‘Community GIS’.

A leading example of this change is the City of Los Angeles GeoHub. Lilian Coral, chief data officer, described how she tries to ensure that the city is using data to help manage the city. To assist with that, they have developed geohub.lacity.org to enable community organisations to do things with city data. It uses open data and open applications to allow new applications to solve problems – from running a Clean Streets Index to compare the information between different areas. GeoHub helps to unlock data in the city and can support a range of applications. Residents are involved in community data collection on the Exide battery contamination that happened in LA. LA is aiming to reduce deaths from accidents on the road, and trying to improve performance over time. They are even trying to explore walking in LA and reduce car dependency. They have learned that the GeoHub is a foundation for smart cities, and they are developing a range of hubs for generating and using geographic information for residents.

After the GeoHub presentation, Jack Dangermond noted that we have an ability to share geographical knowledge like never before. The concept of ArcGIS has evolved to see it as a hub between a system of records, a system of analysis, and a system of engagement, with the growing importance of web services and apps. ArcGIS tools are evolving – the Collector and Survey123 apps link to field workers and data collection. In terms of GIS technology, there is more effort on exploratory spatial data analysis tools (Insights for ArcGIS) and on making it possible to analyse Big Data – for example, a billion transactions – using distributed computation on computer clusters. Applications such as Drone2Map can speed up the process of turning drone imagery into usable maps. There are more development tools for apps, with over 500,000 appearing. The open source apps allow people to develop them further. Esri has run 4 MOOCs and has many learning resources that are free for use by Esri users. Esri supports 11,000 universities and higher education institutions around the world. The people who are working in GIS, engaged and committed, are the people who are creating a smarter and more sustainable world.

Later in the day, some of the technologies that were discussed included the Living Atlas, which is a whole catalogue of updated base maps, and the use of vector data, which allows restyling of information in many ways. A growing range of apps for the field, the office, and the community supports a range of activities. Information for communities includes story maps, open data, photo surveys, crowdsourced reporting and management, and polling.

An example of the utilisation of the apps was provided by the talk “Civic Responsibility – Changing Our Approach” from the City of New Orleans (Lamar Gardere, Greg Hymel & James Raasch). In New Orleans they used Collector to work with volunteers to coordinate and record the progress of a campaign to raise awareness of mosquitoes that can be vectors of disease. They also created a very fast survey method based on images of buildings, using crowdsourced image analysis that covers 6 attributes. The photos were collected throughout the city using a geolocated wide-angle camera. They then prepared the images and created a way of capturing information, asking people to help through crowdsourcing. It is an example of geographical crowdsourcing in government, with micro-tasks: https://propertysurvey.nola.gov/photosurvey/ . They have also created an application to link people to catch basins and to reports from 311 calls. When someone agrees to adopt a ‘catch basin’ (a drain in the street), they share responsibility for checking that it is not blocked before storms arrive and volunteer to clean the drain. They also have a story map that lets people share their pictures, which are integrated into it.
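
[To make the micro-tasking idea concrete, here is a minimal sketch of how several volunteers’ answers about one building photo could be combined by majority vote. The attribute names and responses are invented for illustration and are not the actual New Orleans system.]

```python
# Minimal sketch: aggregate crowd answers about building photos by majority vote.
# Photo ids, attribute names, and responses are invented placeholders.
from collections import Counter

responses = {
    "photo_001": [
        {"occupied": "yes", "roof_damage": "no", "boarded": "no",
         "overgrown": "no", "for_sale": "no", "trash": "no"},
        {"occupied": "yes", "roof_damage": "yes", "boarded": "no",
         "overgrown": "no", "for_sale": "no", "trash": "no"},
        {"occupied": "yes", "roof_damage": "no", "boarded": "no",
         "overgrown": "yes", "for_sale": "no", "trash": "no"},
    ],
}

def consensus(answers):
    """Majority vote per attribute across all volunteer answers for one photo."""
    return {attr: Counter(a[attr] for a in answers).most_common(1)[0][0]
            for attr in answers[0]}

for photo_id, answers in responses.items():
    print(photo_id, consensus(answers))
```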

The afternoon session opened with the main keynote “The Invention of Nature: Alexander von Humboldt’s New World” by Andrea Wulf. She told the story of Alexander von Humboldt, who spent his fortune on a 5-year journey in South America and was the most famous person of his time after Napoleon. He inspired Darwin to go on the Beagle voyage. Many people relate to him and his insights. He died in 1859, and after his death people celebrated him – but he is almost forgotten today. Humboldt invented the concept of nature, noticing the connection between different aspects of the living world and geography. He also defined global climate and vegetation zones. He pioneered mapping and visualisation – using scientific data as a basis for fantastic maps. He can also be associated with concepts of environmentalism. Her book explores him and his insights. The journey from Quito to Chimborazo was similar to a journey from the tropics to the Arctic, and he realised that it was like moving between different regions of the world. He was capable of linking many things together. Humboldt also created new forms of cartography and had an appreciation of indigenous knowledge. Humboldt’s ‘Cosmos’ provided a physical description of the universe, linking many aspects of nature together, and this was his most popular contribution. The network of GIS and the creation of living atlases in GIS is knowledge that brings power to people and communities – we can see a link from practices in GIS back to von Humboldt.

Another major announcement was the “Designing and Creating a Green Infrastructure” effort, with Arancha Muñoz Criado (City and Strategic Planner) and Kaitlin Yarnall (National Geographic Society). It is a common initiative of conservation organisations to create a shared set of information about green spaces and wild spaces. Esri and National Geographic are joining forces to create an information system for this. The notion is to protect green infrastructure across America – a GIS for the whole country, to define the areas that need protection. They will provide extensive information and geodesign tools to allow many people to use the information.

Another important presentation was about “The AmazonGISnet”, with Richard Resl and Domingo Ankuash in Ecuador, who use GIS in new ways. Twenty years ago, Domingo started to use GIS to help the indigenous tribes that he leads to protect their lands. Many local indigenous members of the community have GIS skills and have created a self-made life plan – their own atlas representing their land and views. He said of his community: “We do not live in the forest, we are part of it”. They do not think of themselves as poor, but they need the support of other people to protect their land – having maps that are strategic and mindful. They use GIS not to navigate the forest but to protect it.

The final talk in the event was about connecting GIS with education, noting that there is more work on GIS in schools across the US and the world. San Andreas High School started with GIS only 18 months ago, with just one teacher getting into GIS, but it is already achieving results through collaboration with GIS mentors. It is an area where 98% of students receive free lunches. GIS is a force for good. They created a story map about teens and drinking and alcohol abuse, showing analysis and considerations within the process. Students also set up data collection for surveying the state of sidewalks using Survey123.

Esri Education Conference 2016 – day 1

I’ve been working with Geographic Information Systems (GIS) since 1988. During the first 2 years, I wasn’t even aware that what we were doing was GIS – it was a mapping/inventory system that ran on second-generation PCs (80286 processors) and was used to map facilities. Once I discovered that this was a GIS, the next thing I heard was ‘and ESRI Arc/Info is the software that you should check’. I’ve heard a lot over the years about the Esri User Conference, but haven’t had the chance to attend it, so this year I’m filling in that gap in my experience of the world of GIS. I’m giving a keynote at the Esri Education User Conference (EdUC), and I’ll attend some other parts of the general User Conference, and report on the experience.

The Education UC opened with an interest in creativity. Angie Lee, who opened the conference, noted that her inspiration for the theme came from learning about the maker movement and the growing interest in teaching students to code. She noted that many aspects of GIS encourage creativity: developing a story map or building an app, and the opportunities that are emerging with new technologies. This is also true in science for the development of hypotheses and methodologies.

The first of the two keynotes on the first day was by Dave Zaboski (Professional Artist / Creative Consultant), a former Senior Animator with Disney. Dave suggested that people have an innate capacity for creativity – one that is common at a young age but disappears later. Creativity is the willingness to try things; it’s the courage to take risks. It’s the act of turning a thought into a thing, but we tend to get lost in the process. Resolute imagination leads to magical results. The approach that he suggests highlights a spiral iteration instead of trying to move directly towards a goal. There are five key attributes. First, creators believe in what they are doing. Have a clear narrative, and try to reduce noise to increase the signal. You can think of things you have no control over (the A list), issues that you can influence (the B list), and a D+ list of things that make you lighter and happy, which make up the ‘signal’ in life. The A list should not be dwelt on – someone else is going to take care of it, and it will be their D+ list. The B list can be dealt with one thing a day, and then you can focus on the main issues and on the D+ list. Creators also need to collaborate, and to risk in a powerful way – when iterating you need to be confident that you can try again, and to know that it might require throwing away work you have already done. You need to learn to trust your own abilities, and to allow – instead of cognitive dissonance – a creative confusion, until the information organises itself in a way that makes sense. Completion is also important: acknowledge the people who were involved, analyse, and celebrate. When you can declare that you have completed, you can move to the next challenge. Creators believe, iterate, collaborate, risk, and complete.
The second keynote was by Dominique Evans-Bye, Science Teacher, Clark Magnet High School. She gave a detailed presentation on teaching GIS in high school through project-based learning – including diving and operating ROVs. She works with low-income students, many from immigrant families. The first project the students did, in 2006, was to analyse sediments and heavy metals in the Los Angeles Harbour. The students collected samples, analysed the level of contamination in them, and visualised the information in a GIS. In the next stage, they examined contamination in lobsters and analysed the tissues with mass spectrometers; they found high arsenic levels. The students gained confidence, learned through iteration, and used the ArcGIS tutorials online and offline to develop skills and apply them to analyse their information. The exploration of the problem led to new questions and new ways to represent the information. The students are also building applications with the Esri Collector app to understand how litter can end up in the ocean. Another project involved analysing albatross migration, through classes on environmental GIS that were problem-oriented and built on all the skills they had gained in running the research process itself. The students collaborate and see the process through, from data collection to analysis. Students experience collaboration with scuba divers from NOAA and other bodies, and they have won awards and scholarships as a result of their efforts. There are major benefits in creating a creative learning environment with high school students and allowing them to develop their learning through problem solving.

In an afternoon session, I presented a talk that Patrick Rickles prepared, titled GIS and Citizen Science: Combining Open Source and Esri Technologies. The presentation focused on the way that some of the technologies developed in the ExCiteS group, such as GeoKey and Community Maps, can work with Esri technologies. The presentation opens by explaining the needs and requirements – the interdisciplinarity of the group and the types of areas that we work in. It then demonstrates this using the Challenging Risk project in Seattle, which looks at participatory methods for community preparedness for earthquakes and fire. The context of the project means that there can be two-way data sharing – from the community to local government, so they can see the information in ArcGIS Online, and from the open data store, which can be shared on Community Maps. Several other examples of Esri technologies in use are shown.

 

Esri Education User Conference talk: Citizen Science & Geographical Technologies: creativity, learning, and engagement

The slides below are from my keynote talk at the Esri Education User Conference 2016. The conference focused on creativity and its relevance to education and the utilisation of GIS (especially Esri software) at different levels of education.

My talk explored the area of citizen science and extreme citizen science and the way geographical technologies contribute to creativity and learning. As I continue to assume that many of the audience don’t know about citizen science, I start with a review of the field as a way to contextualise what we, as a group, try to do.

[The talk is similar, in parts, to other talks that are captured here on my blog (workshop on theory, practice and policy, standards and recommendation for citizen science, or the current developments in ExCiteS). I’m updating the slides with lessons on what seems to work or not in previous talks. Social media is helpful for that – I can see which points people found most useful/meaningful!]

The talk starts with a historical perspective on citizen science and continues with the societal and technical trends that are at the basis of the current growth in citizen science. Having done that, I’m using a typology that looks at domain (academic discipline), technology, and engagement as a way to introduce examples of citizen science activities. I’m using the trailer for the TV series ‘the Crowd & the Cloud’ to recap the discussion of citizen science activities. I also mention the growth of the practitioner community through the Citizen Science Associations.

Next, on this basis, I’m covering the concepts and practices of Extreme Citizen Science – what we do and how. I’m using examples from the work on noise, community resource management and earthquake and fire preparedness to demonstrate the concept.

The last part of the talk focuses specifically on creativity and learning from the Citizen Cyberlab project, and I explain the next steps that we will carry out in the Doing It Together Science project. I complete the talk by giving examples for activities that the audience can do by themselves.

Throughout the talk, I’m showing how Esri technologies are being used in citizen science. It wasn’t difficult to find examples – Esri’s GIS is used in BioBlitzes and Globe at Night, links to OpenStreetMap, and supports the work that the ExCiteS group is doing. Survey123 and similar tools can be used to create novel projects and experiment with them. ArcGIS Online will be linked to GeoKey, to allow analysis of community mapping efforts. In short, there is plenty of scope for GIS as an integral part of citizen science projects.
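
[As a rough illustration of what such a linkage could look like, the sketch below pulls contributions from a GeoKey instance as GeoJSON and saves them for publishing to ArcGIS Online. The host, project id, endpoint path, and token are assumptions for illustration, not the documented API.]

```python
# Rough sketch: pull community contributions from a GeoKey instance as GeoJSON
# and save them for upload to ArcGIS Online. The host, project id, endpoint
# path, and token below are placeholders, not the documented GeoKey API.
import json
import requests

GEOKEY_HOST = "https://geokey.example.org"   # hypothetical instance
PROJECT_ID = 42                              # hypothetical project
TOKEN = "YOUR-ACCESS-TOKEN"                  # placeholder credential

resp = requests.get(
    f"{GEOKEY_HOST}/api/projects/{PROJECT_ID}/contributions/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

with open("community_contributions.geojson", "w") as f:
    json.dump(resp.json(), f)

# The saved GeoJSON could then be added as an item in ArcGIS Online and
# published as a hosted feature layer for analysis alongside other layers.
```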

Extreme Citizen Science in Esri ArcNews

The winter edition of Esri ArcNews (which, according to Mike Gould of Esri, is printed in as many copies as Forbes) includes an article on the activities of the Extreme Citizen Science group in supporting indigenous groups in mapping. The article highlights the Geographical Information Systems (GIS) aspects of the work and mentions many members of the group.

You can read it here: http://www.esri.com/esri-news/arcnews/winter16articles/mapping-indigenous-territories-in-africa

Building Centre – from Mapping to Making

The London based Building Centre organised an evening event – from Mapping to Making –  which looked at the “radical evolution in the making and meaning of maps is influencing creative output. New approaches to data capture and integration – from drones to crowd-sourcing – suggest maps are changing their impact on our working life, particularly in design.”  The event included 5 speakers (including me, on behalf of Mapping for Change) and a short discussion.

Lewis Blackwell of the Building Centre opened the evening by noting that, in a dedicated exhibition on visualisation and the city, the Building Centre is looking at new visualisation techniques. He realised that a lot of the visualisations are connected to mapping – it’s circular: mapping can ask and answer questions about the design process of the built environment, and changes in the built environment create new data. The set of talks in the evening explored the role of mapping.

Rollo Home, Geospatial Product Development Manager, Ordnance Survey (OS), started by thinking about the OS as the ‘oldest data company in the world‘. The OS thinks of itself as a data company – the traditional mapping products that are very familiar represent only 5% of turnover. The history of the OS goes back to 1746 and William Roy’s work on accurately mapping Britain. The first maps were produced in Kent, for the purpose of positioning ordnance. The maps of today, when visualised, look somewhat the same as maps from 1800, but the current maps are in machine-readable formats, which means that the underlying information is very different. Demands for mapping changed over the years: originally for ordnance, then for land information and taxation, and later to help the development of the railways. During WWI and WWII the OS led many technological innovations – from the National Grid in the 1930s to photogrammetry. In 1973 the first digital maps were produced, and the process was completed in the 1980s. This was, in terms of data structures, still structured as a map. Only in 2000 did MasterMap appear, with a more machine-readable format that is updated 10,000 times a day, based on an Oracle database (the biggest spatial database in the world) – but it’s not a map. Real-world information is modelled to allow for structure and meaning. The ability to answer questions from the database is critical to decision-making. The information in the data can make many aspects explicit – from the area of rear gardens to the height of a building. They see developments in the areas of oblique image capture, 3D data, details under the roof, and facades, and they do a lot of research to develop their future directions – e.g. the challenges of capturing data in point clouds. They see data coming from different sources, including social media, satellites, UAVs, and official sources. Most smart cities/transport areas need geospatial information, and the OS is moving from mapping to data, enabling better decisions.

Rita Lambert, Development Planning Unit, UCL, covered the ReMap Lima project – running since 2012 and looking at marginalised neighbourhoods in the city. The project focused on the questions of what we are mapping and what we are making through representations. Maps contain the potential of what might become – we are making maps and models that are about ideas and possibilities for more just cities. The project is a collaboration between the DPU and CASA at UCL, with 3 NGOs in Lima and 40 participants from the city. They wanted to explore the political agency of mapping, open up spaces to negotiate outcomes, and expand the possibilities of spatial analysis in marginalised areas through a participatory action-learning approach. The use of technology is in the context of very specific theoretical aims. The use of UAVs is deliberate, to explore their progressive potential. They mapped the historic centre, which is overmapped and marginalised through over-representation (e.g. using maps to show that it needs regeneration), while the periphery is undermapped – a large part of the city (50% of the area) marginalised through omission. Maps can act through undermapping or overmapping. The issues are very different – evictions, lack of services, and loss of cultural heritage (people and buildings) at the centre, while in the informal settlements there are risks, land trafficking, destruction of ecological infrastructure, and a lack of coordination in spatial planning between places. The process that they followed included mapping from the sky (with a drone) and mapping from the ground (through participatory mapping using aerial images). The drones provided the imagery in an area that changes rapidly, and the outputs were used in participatory mapping, with the people on the ground deciding what to map and where to map. The results allow evictions to be identified through changes to buildings that can be observed from above. The mapping process itself was also a means to strengthen community organisations. The use of 3D visualisation at the centre and at the periphery helped in understanding the risks that are emerging and the changes to the area. Data collection uses both maps and tools such as EpiCollect+ and community mapping, as well as printed 3D models that can be used in discussions and conversations. The work carries on as the local residents continue it. The conclusion: careful consideration of the use of technology in context, and mapping from the sky and from the ground go hand in hand. Creating these new representations is significant, as is what we are producing. More information at Remaplima.blogspot.co.uk and learninglima.net

Simon Mabey, Digital Services Lead for City Modelling, Arup. Simon discussed city modelling at Arup – the move from visualisation to more sophisticated models. He has led on modelling cities in 3D since 1988, when visualisation of future designs was done by stitching together pieces of paper and photos. The rebuilding of Manchester in the mid-1990s led to the development of 3D urban modelling, with animations, and to an interactive CD-ROM. They continued to develop the data about Manchester and then shared it with others. The models were used in different ways – from gaming software to online – and they are trying to find ways to allow people to use them in real-world contexts. Many models are used in interactive displays – e.g. for attracting inward investment. They went on to model many cities across the UK, with different levels of detail and coverage. They are also starting to identify features underground – utilities and the like. Models are kept up to date through collaboration, with clients providing back information about the things that they are designing and integrating BIM data. In Sheffield, they also enhance the model through the planning of new projects and activities. Models are used to communicate information to other stakeholders – e.g. traffic model outputs – and they do the same with pedestrian movement. Different information is used to colour-code the model (e.g. energy), or for acoustic modelling or flooding. More recently, they have moved to city analytics, understanding the structure within models – for example, understanding solar energy potential alongside the use and consumption of the building. They find themselves needing information about what utility data exists, which needs to be mapped and integrated into their analysis. They are also getting mobile phone data to predict the trips that people make.

I was the next speaker, on behalf of Mapping for Change. I provided the background of Mapping for Change and the approach that we are using for the mapping. In the context of the other talks, which focused on technology, I emphasised that just as we are trying to reach out to people in the places that they use daily and fit the participatory process into their life rhythms, we need to do the same in the online environment. That means that conversations need to go where people are – so linking to Facebook, Twitter, or WhatsApp. We should also know that people use different ways to access information – some will use just their phone, others laptops, and for others we need to think of a laptop/desktop environment. In a way, this complicates participatory mapping much more than earlier participatory web mapping systems, when participants were more used to the idea of using multiple websites for different purposes. I also mentioned the need to listen to the people that we work with and to decide whether information should be shown online or not – taking into account what they would like to do with the data. I mentioned the work that involves citizen science (e.g. air quality monitoring) but, more generally, the ability to collect facts and evidence to deal with a specific issue. Finally, I also used some examples from our new community mapping system, which is based on GeoKey.

The final talk was from Neil Clark, Founder, EYELEVEL. He is from an architectural visualisation company that works in the North East and operates in the built environment area. They use architectural modelling and Ordnance Survey data and then position the designs so they can be rendered accurately. Many of the processes are very expensive and complex. They have developed a tool called EYEVIEW for accurate augmented reality – working on an iPad to allow viewing models in real time. This can cut the costs of producing these models. They use a tripod to make it easier to control. The tool is the outcome of 4 years of development and allows navigating the architectural model to overlay it with the image. They are aiming at Accurate Visual Representation, and they follow the detailed framework that is used in London for this purpose: www.eyeviewportal.com

The discussion that followed explored the political nature of information and who is represented and how. A question to the OS was how open it will be with the detailed data; Rollo explained that access to the data is a complicated issue and it needs to be funded. I found myself defending the justification for charging for highly detailed models by suggesting that we imagine a situation where the universal provision of high-quality data at the national level wasn’t there, and you had to deal with each city’s data model separately.

The last discussion point was about truth in mapping and the positions that were raised – is it about the way that people understand their truth, or is there an absolute truth that is captured in models and maps, or represented in 3D visualisations? Interestingly, three of the talks assumed that there is a way to capture specific aspects of reality (structures, roads, pollution) and model them with numbers, while Rita and I took a more interpretive and culturally led approach to representation.

OpenStreetMap studies (and why VGI not equal OSM)

As far as I can tell, Nelson et al. (2006) ‘Towards development of a high quality public domain global roads database‘ and Taylor & Caquard (2006) Cybercartography: Maps and Mapping in the Information Era are the first peer-reviewed papers that mention OpenStreetMap. Since then, OpenStreetMap has received plenty of academic attention. More ‘conservative’ search engines such as ScienceDirect or Scopus find 286 and 236 peer reviewed papers (respectively) that mention the project. The ACM digital library finds 461 papers in the areas that are relevant to computing and electronics, while Microsoft Academic Research finds only 112. Google Scholar lists over 9000 (!). Even with the most conservative version from Microsoft, we can see an impact on fields ranging from social science to engineering and physics. So lots to be proud of as a major contribution to knowledge beyond producing maps.

Michael Goodchild, in his 2007 paper that started the research into Volunteered Geographic Information (VGI), mentioned OpenStreetMap (OSM), and since then there has been a lot of conflation of OSM and VGI. In some recent papers you can find statements such as ‘OpenstreetMap is considered as one of the most successful and popular VGI projects‘ or ‘the most prominent VGI project OpenStreetMap‘, so, at some level, the boundary between the two is being blurred. I’m part of the problem – for example, with the title of my 2010 paper ‘How good is volunteered geographical information? A comparative study of OpenStreetMap and Ordnance Survey datasets‘. However, the more I think about it, the more uncomfortable I am with this equivalence. I feel that the recent line from Neis & Zielstra (2014) is more accurate: ‘One of the most utilized, analyzed and cited VGI-platforms, with an increasing popularity over the past few years, is OpenStreetMap (OSM)‘. I’ll explain why.

Let’s look at the whole area of OpenStreetMap studies. Over the past decade, several types of research paper have emerged.

First, there is a whole set of research projects that use OSM data because it’s easy to use and free to access (in computer vision or even string theory). These studies are not part of ‘OSM studies’ or VGI, as, for them, this is just data to be used.


Second, there are studies about OSM data: quality, evolution of objects and other aspects from researchers such as Peter Mooney, Pascal Neis, Alex Zipf  and many others.

Third, there are studies that also look at the interactions between the contribution and the data – for example, in trying to infer trustworthiness.

Fourth, there are studies that look at the wider societal aspects of OpenStreetMap, with people like Martin Dodge, Chris Perkins, and Jo Gerlach contributing in interesting discussions.

Finally, there are studies of the social practices in OpenStreetMap as a project, with the work of Yu-Wei Lin, Nama Budhathoki, Manuela Schmidt and others.

[Unfortunately, due to academic practices and publication outlets, many of these papers are locked behind paywalls, but that is another issue… ]

In short, there is a significant body of knowledge regarding the nature of the project, the implications of what it produces, and ways to understand the information that emerges from it. Clearly, we now know that OSM produces good data, and we are aware of the patterns of contribution. What is also clear is that many of these patterns are specific to OSM. Because of the importance of OSM to so many application areas (including illustrative maps in string theory!) these insights are very important. Some of these insights are expected to also be present in other VGI projects (hence my suggestions for assertions about VGI), but this needs to be done carefully, only when there is evidence from other projects that this is the case. In short, we should avoid conflating VGI and OSM.

Happy 10th Birthday, OpenStreetMap!

Today, OpenStreetMap celebrates 10 years of operation, counted from the date of registration. I heard about the project when it was in its early stages, mostly because I knew Steve Coast when I was studying for my Ph.D. at UCL. As a result, I was also able to secure the first ever research grant that focused on OpenStreetMap (and hence Volunteered Geographic Information – VGI) from the Royal Geographical Society in 2005. A lot can be said about being in the right place at the right time!

OSM Interface, 2006 (source: Nick Black)

Having followed the project during this decade, there is much to reflect on – such as thinking about open research questions, things that the academic literature failed to notice about OSM, or the things that we do know about OSM and VGI because of the openness of the project. However, as I was preparing the talk for the INSPIRE conference, I started to think about the start dates of OSM (2004), TomTom Map Share (2007), Waze (2008), and Google Map Maker (2008). While there are conceptual and operational differences between these projects, in terms of ‘knowledge-based peer production systems’ they are fairly similar: all rely on a large number of contributors; all have both a large group of contributors who contribute a little and a much smaller group of committed contributors who do the more complex work; and all are about mapping. Yet OSM started 3 years before these other crowdsourced mapping projects, and all of them have more contributors than OSM.

Since OSM is described as the ‘Wikipedia of maps‘, the analogy that I started to think of was that it’s a bit like a parallel history in which, in 2001, as Wikipedia starts, Encarta and Britannica look at the upstart and set up their own crowdsourcing operations, so within 3 years they are up and running. By 2011, Wikipedia continues as a copyright-free encyclopedia with a sizeable community, but Encarta and Britannica have more contributors and more visibility.

Knowing OSM closely, I felt that this is not a fair analogy. While there are some organisational and contribution practices that can be used to claim that ‘it’s the fault of the licence’ or ‘it’s because of the project’s culture’ and therefore justify this unflattering analogy to OSM, I sensed that something else is needed to explain what is going on.

Then, during my holiday in Italy, I was enjoying the offline TripAdvisor app for Florence, using OSM for navigation (in contrast to Google Maps, which is used in the online app), and an answer emerged. Within the OSM community, from the start, there was some tension between the ‘map’ and ‘database’ views of the project. Is it about collecting the data to make beautiful maps, or is it about building a database that can be used for many applications?

Saying that OSM is about the map means that the analogy is correct, as it is very similar to Wikipedia – you want to share knowledge, so you put it online with a system that allows you to display it quickly, with tools that support easy editing and information sharing. If, on the other hand, OSM is about a database, then it is something that is used at the back-end of other applications, much like a DBMS or an operating system. Although there are tools that help you do things easily and quickly and to check the information that you have entered (e.g. displaying the information as a map), the main goal is building the back-end.
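To make the ‘database’ view concrete, here is a minimal sketch of querying OSM as a database rather than looking at a rendered map, using the public Overpass API. The endpoint is real, but the query, tags and bounding box are purely illustrative.

```python
# Minimal sketch: treating OSM as a database by querying the Overpass API.
# The endpoint is the public one; the query and bounding box are illustrative.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Overpass QL: all drinking-water nodes in a small box around central London.
# The bounding box order is (south, west, north, east).
query = """
[out:json][timeout:25];
node["amenity"="drinking_water"](51.50,-0.15,51.52,-0.10);
out;
"""

resp = requests.post(OVERPASS_URL, data={"data": query}, timeout=60)
resp.raise_for_status()

for element in resp.json()["elements"]:
    print(element["id"], element["lat"], element["lon"])
```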

Maybe a better analogy is to think of OSM as the ‘Linux of maps’, which means that it is an infrastructure project that is expected to have a lot of visibility among the professionals who need it (system managers in the case of Linux, GIS/Geoweb developers for OSM), with a strong community that supports and contributes to it. In the same way that some tech-savvy people know about Linux but most people don’t, I suspect that TripAdvisor offline users don’t notice that they are using OSM – they are just happy to have a map.

The problem with the Linux analogy is that OSM is more than software – it is indeed a database of information about geography from all over the world (and therefore the Wikipedia analogy has its place). It is therefore somewhere in between. In a way, it provides a demonstration of the common claim in GIS circles that ‘spatial is special‘. Geographical information is infrastructure in the same way that operating systems or DBMSs are, but in this case it is not enough to create an empty shell that can be filled in for a specific instance; a significant amount of base information is needed before you can start building your own application with additional information. This is also the philosophical difference that makes the licensing issues more complex!

In short, both the Linux and the Wikipedia analogies are inadequate to capture what OSM is. It has been illuminating and fascinating to follow the project over its first decade, and may it continue successfully for more decades to come.

Second day of INSPIRE 2014 – open and linked data

Opening up geodata is an interesting issue for the INSPIRE directive. INSPIRE was set up before the Government 2.0 hype grew and before the pressure to open data became apparent, so it was not explicitly designed with these aspects in mind. The way in which the organisations implementing INSPIRE deal with the provision of open and linked data is therefore bound to bring up interesting challenges.

Dealing with open and linked data was the topic that I followed on the second day of the INSPIRE 2014 conference. The notes below are my interpretation of some of the talks.

Tina Svan Colding discussed the Danish attempt to estimate the value (mostly economic) of open geographic data. The study was done in collaboration with Deloitte, and they started with a theory of change – the expectation that they would see increased demand from existing customers and from new ones. The next assumption was that there would be new products, new companies and lower prices, and that this would lead to efficiency and better decision making across the public and private sectors, but also increased transparency for citizens. In short, they were trying to capture the monetary value, with a bit on the side. They used statistics and interviews with key people in the public and private sectors, and followed these with a wider survey – all with existing users of the data. The number of users of their data increased from 800 to over 10,000 within a year. The Danish system requires users to register to get the data, so these are bulk numbers, but it also meant they could contact users to ask further questions. Among the new users, many are citizens (66%) and NGOs (3%). A further 6% are in the public sector; they had access in principle in the past, but the improved accessibility made the data usable to new people in this sector. In the private sector, construction, utilities and many other companies are using the data. The environmental bodies aim to use the data in new ways to make environmental consultation more engaging for audiences (is this another deficit model assumption – that people don’t engage because it’s difficult to access data?). Issues that people experienced include accessibility for users who don’t know that they need to use GIS and other datasets. They also identified requests for further data releases. In the public sector, 80% identified potential for savings with the data (though that is the type of expectation that they live with!).

Roope Tervo, from the Finnish Meteorological Institute, talked about the implementation of their open data portal. Their methodology kept users very much in mind and is a nice example of user-centred data application. They hold a lot of data – from meteorological observations to air quality data (of course, it all depends on the role of the institute). They chose WFS as the download service, with GML as the data format and coverage data in meteorological formats (e.g. GRIB). He showed that the selection of data models (all of which can be compatible with the legislation) can have very different outcomes in file size and in the complexity of parsing the information. It was nice to see that they considered user needs, though not formally. They created an open source JavaScript library that makes it easy to use the data – going beyond just releasing the data to supporting how it is used. They have API keys that are based on registration, and they had to limit the number of requests per day, and the same for the view service. After a year, they have 5,000 users and 100,000 data downloads per day, and the numbers are increasing, albeit slowly. They are considering how to help clients with complex data models.
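As a rough illustration of the kind of download service described here, the sketch below issues a WFS 2.0 GetFeature request in Python. The endpoint, stored query name and parameters are assumptions for illustration, not the institute’s documented interface, and a real request would also involve the registration-based API key mentioned in the talk.

```python
# Sketch of a WFS 2.0 GetFeature request against an open data download service.
# The endpoint and stored query are assumed for illustration only.
import requests

WFS_ENDPOINT = "https://opendata.fmi.fi/wfs"  # assumed endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    # Assumed stored-query identifier for simple weather observations:
    "storedquery_id": "fmi::observations::weather::simple",
    "place": "Helsinki",
}

resp = requests.get(WFS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()

# The payload is GML (an XML dialect), which is why the choice of data model
# matters so much for file size and parsing complexity.
print(resp.text[:500])
```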

Panagiotis Tziachris explored the clash between the ‘heavy duty’ and complex INSPIRE standards and the light-weight approaches that are common in open data portals (I think he meant those in the commercial sector that allow some reuse of data). This is a project of 13 Mediterranean regions in Spain, Italy, Slovenia, Montenegro, Greece, Cyprus and Malta. The HOMER project (website http://homerproject.eu/) used different mechanisms, including hackathons, to share knowledge and experience between more experienced players and those that are new to the area, and found them to be a good way to share practical knowledge between partners. This is an interesting use of a purposeful hackathon among people who already know each other within a project, and I think it can be useful for other cases. Interestingly, on the legal side they had to go beyond the usual documents that are provided in an EU consortium: in order to allow partners to share information, they created a memorandum of understanding to deal with IP and similar issues. Open data practices were also used, such as the CKAN API, which is a common one for open data websites. They noticed a separation between central administration and local or regional administration – the competency of the more local organisations (municipality or region) is sometimes limited because the knowledge is elsewhere (in central government), or they are at different stages of implementation, and disagreements about releasing the data can arise. Another issue is that open data is sometimes provided at regional portals while a different organisation at the national level (environment ministry or cadastre body) is responsible for INSPIRE. The lack of capabilities at different governmental levels adds to the challenges of setting up open data systems. Sometimes open data legislation covers only the final stage of the process and not how to get there, while INSPIRE is all about the preparation and not about the release of data – this also creates a mismatch.
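For readers unfamiliar with the light-weight side of this clash, here is a minimal sketch of querying a CKAN-based portal through its standard Action API; the portal URL and search term are placeholders.

```python
# Minimal sketch of a CKAN Action API query (package_search is part of the
# standard CKAN API). The portal URL and search term are placeholders.
import requests

CKAN_URL = "https://demo.ckan.org"  # placeholder portal

resp = requests.get(
    f"{CKAN_URL}/api/3/action/package_search",
    params={"q": "land use", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} datasets match")
for dataset in result["results"]:
    print("-", dataset["title"])
```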

Adam Iwaniak discussed how ‘over-engineering’ makes the INSPIRE directive inoperable or irrelevant to users, on the basis of his experience in Poland. He asked “what are the user needs?” and demonstrated the point by noting that, after half a term of teaching students about the importance of metadata, when it came to actively searching for metadata in an assignment the students didn’t use any of the specialist portals but just Google. Based on this and similar experiences, he suggested the creation of a thesaurus that describes the keywords and features in the products, so that searching according to user needs becomes possible. Of course, the implementation is more complex, and he therefore suggests an approach that works within the semantic web and uses RDF definitions, making the data searchable and indexable in search engines so that it can be found. The core message was to adapt the delivery of information to the way the user is most likely to search for it – so metadata is relevant when the producer makes sure that a search in Google finds it.
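A minimal sketch of what such an approach might look like, assuming the DCAT vocabulary is used to describe a dataset and its keywords as RDF; the dataset URI, title and keywords are invented for illustration.

```python
# Sketch: describing a dataset and its keywords as RDF using DCAT, so the
# record can be published and indexed by search engines. All identifiers and
# literals here are invented for illustration.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

g = Graph()
dataset = URIRef("http://example.org/datasets/land-cover")  # hypothetical URI

g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Regional land cover", lang="en")))
g.add((dataset, DCAT.keyword, Literal("land cover")))
g.add((dataset, DCAT.keyword, Literal("land use")))

# Serialise as Turtle; the same triples could also be embedded in a web page
# (e.g. as JSON-LD) so that a plain web search can find the record.
print(g.serialize(format="turtle"))
```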

Jesus Estrada Vilegas from the SmartOpenData project (http://www.smartopendata.eu/) discussed the implementation of some ideas that can work within the INSPIRE context while providing open data, in particular a Spanish and Portuguese data sharing pilot. Within the project, they provide access to the data by harmonising it and then making it available as linked data. Not all the data is open, and the focus of their pilot is agroforestry land management. They are testing delivery of the data both in INSPIRE-compliant formats and in the internal organisation format, to see which is more efficient and useful. INSPIRE is a good starting point for developing linked data, but there is also a need to compare it to other ways of linking the data.

Massimo Zotti talked about linked open data from earth observations in the context of business activities, since he works in a company that provides software for data portals. He explored the business model of open data, INSPIRE and the Copernicus programme. Data that comes from earth observation can be turned into information – for example, identifying areas of soil that have been sealed and no longer absorb water, or information about forest fires, floods, etc. These are the bits of useful information that are needed for decision making. Once the information exists, it is possible to identify increases in land use or other aspects that can inform policy. However, we need to notice that dealing with open data means a lot of work is put into bringing datasets together. Standardisation of data transfer and the development of approaches that help machine-to-machine analysis are important for this aim. By fusing datasets, they become more useful and relevant to the knowledge production process. A dashboard approach to displaying the information and the processing can help end users access the linked data ‘cloud’. Standardisation of data is very important to facilitate such automatic analysis, and standard ontologies are also necessary. In my view, this is not a business model but a typical account of operations in the earth observation area, where a lot of energy is spent on justifying that the data can be useful and important to decision making, while the effort required to go through the process, and the speed at which results can be achieved (will the answer come in time for the decision?), remain unquantified. A member of the audience also raised the point that the assumption that machine-to-machine automatic models will produce valuable information all by themselves is questionable.

Maria Jose Vale talked about the Portuguese experience in delivering open data; the organisation she works in deals with cadastre and land use information, and she also discussed activities of the SmartOpenData project. She described the principles of open data that they considered: data must be complete, primary, timely, accessible and processable; data formats must be well known; there should be permanence; and usage costs should be addressed properly. For good governance you need to know the quality of the data and the reliability of delivery over time, so having automatic ways for the data to propagate to users fits within these principles. The benefits of open data that she identified are mostly technical, but also economic (and these are mentioned many times – but you need evidence similar to the Danish case to prove them!). The issues or challenges of open data include how to deal with fuzzy data when releasing it (my view: tell people that it needs cleaning); safety, as there are both national and personal issues; financial sustainability for the producers of the data; rates of update; and addressing user and government needs properly. In a case study that she described, they looked at land use and land cover changes to assess changes in river use in a watershed. They needed about 15 datasets for the analysis and used CORINE land cover information from different years. For example, they have seen forest change to woodland because of fire, which also influences water quality. Data interoperability and linking data allow integrated modelling of the evolution of the watershed.

Francisco Lopez-Pelicer covered the Spanish experience and the PlanetData project (http://www.planet-data.eu/), which looks at large-scale public data management, and specifically a pilot on VGI and linked data with a background in SDI and INSPIRE. There is big potential, but many GI producers don’t do it yet. The issue is legacy GIS approaches such as WMS and WFS, which are standards endorsed in INSPIRE but do not necessarily fit into a linked data framework. In the work that he was involved in, they try to address a complex GI problem with linked data. To do that, they convert a WMS into a linked data server by adding URIs and POST/PUT/DELETE resources. A semantic client sees this as a linked data server, even though it can also be compliant with other standards. To try it out, they use the open national map as the authoritative source and OpenStreetMap as the VGI source, and release both as linked data. They are exploring how to convert a large authoritative GI dataset into linked data and also how to link it to other sources. They are also using it as an experiment in crowdsourcing platform development – creating a tool that helps to assess the quality of each dataset. The aim is to run quality experiments and measure the data quality trade-offs associated with using authoritative or crowdsourced information. Their service can behave as both a WMS and a “Linked Map Server”. LinkedMap, which is the name of this service, provides the ability to edit the data and explore OpenStreetMap and the government data; they aim to run the experiment in the summer, and it can be found at http://linkedmap.unizar.es/. The reason for choosing WMS as a delivery standard is that a previous crawl of the web showed that WMS is the most widely available service, so it is assumed to be relevant to users, or at least one that most users can consume.
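For context, the sketch below shows an ordinary WMS GetMap request made with the OWSLib library, the sort of legacy OGC interface the talk describes wrapping as a linked data server. The endpoint URL, layer name and bounding box are placeholders, not the LinkedMap service itself.

```python
# Sketch of a plain WMS GetMap request with OWSLib, illustrating the legacy
# OGC interface discussed in the talk. Endpoint, layer and bbox are placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/wms", version="1.3.0")  # placeholder

img = wms.getmap(
    layers=["base_map"],            # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-1.0, 41.5, -0.7, 41.8),  # rough Zaragoza area, for illustration
    size=(512, 512),
    format="image/png",
)

with open("map.png", "wb") as f:
    f.write(img.read())
```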

Paul van Genuchten talked about the GeoCat experience in a range of projects, including support for Environment Canada and other activities. INSPIRE meeting open data can be a clash of cultures, and he used neogeography as the term to describe the open data culture (going back to the neogeo/paleogeo debate, which I thought was over and done with – but clearly it is relevant in this context). INSPIRE recommends publishing data openly, and this is important to ensure that the data reaches a big potential audience, as well as the ‘innovation energy’ that exists among the ‘neogeo’/‘open data’ people. Common expectations within this culture are that APIs are easy to use, that interfaces are clean, and so on; but under the hood there are similarities in the way things work. There is a perceived complexity, among the community of open data users, in INSPIRE datasets. Many open data people are focused on and interested in OpenStreetMap, look at companies such as MapBox as role models, and favour formats such as GeoJSON and TopoJSON. Data is versioned and managed in a git-like process, and the most common projection is web mercator. There are now not only raster tiles but also vector tiles. Data providers can use these characteristics of the audience to help people use their data, and there are also intermediaries that deliver the data and convert it to more ‘digestible’ forms. He noted CitySDK by Waag.org, which grabs data from INSPIRE and then delivers it to users in ways that suit open data practices. He demonstrated the case of Environment Canada, where they created a set of files suitable for both human and machine use.
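To show what the light-weight end of that culture looks like, here is a minimal sketch that builds and serialises a single GeoJSON feature in Python; the coordinates and properties are invented for illustration.

```python
# Sketch: building a single GeoJSON feature collection, the kind of
# light-weight format favoured by the open data / 'neogeo' culture.
# Coordinates and properties are invented for illustration.
import json

feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [4.90, 52.37],  # lon, lat (roughly Amsterdam)
    },
    "properties": {"name": "Example monitoring station"},
}

collection = {"type": "FeatureCollection", "features": [feature]}

# GeoJSON coordinates are WGS84 lon/lat; web maps typically re-project to
# web mercator (EPSG:3857) for display.
print(json.dumps(collection, indent=2))
```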

Ed Parsons finished the set of talks of the day (talk link goo.gl/9uOy5N) with a talk about a multi-channel approach to maximising the benefits of INSPIRE. He highlighted that it’s not about linked data, although linked data is part of the solution to making data accessible. Accessibility always wins online, and people make compromises (e.g. sound quality with CDs versus Spotify). Google Earth can be seen as a new channel that makes things accessible; while the back-end technology is not new, the ease of access made a big difference. Denmark’s use of Minecraft to release GI is an example of another channel. Notice the change in video delivery over the past 10 years, for example: in the early days, video delivery was complex and required many steps, expensive software and infrastructure, which is somewhat comparable to current practice in geographic information. Making things accessible through channels like YouTube, and the whole ecosystem around it, changed the way video is uploaded, used and consumed, and of course changes in devices (e.g. recording on the phone) made it even easier. Focusing on maps themselves, people might want different kinds of maps, not only the latest searchable map that Google provides – e.g. an administrative map of medieval Denmark, flood maps, or something else that is specific and not part of general web mapping. In some cases people are searching for something and you want to give them maps for some queries and images for others (as in searching for Yosemite trails vs. Yosemite). There are plenty of maps that people find useful, and for that Google is now promoting Google Maps Gallery – with tools to upload, manage and display maps. It is also important to consider that mapping information needs to be accessible to people who are using mobile devices. The web infrastructure of Google (or ArcGIS Online) provides the scalability to deal with many users and the ability to deliver to different platforms such as mobile. The gallery allows people to brand their maps. Google wants to identify authoritative data that comes from official bodies, and then to have additional information that is displayed differently. But separating facts and authoritative information from commentary is difficult, and that is where semantics play an important role. He also noted that Google Maps Engine is just maps – a visual representation without an aim to provide GIS analysis tools.