25 November, 2009
Most of the work that we carried out at UCL in evaluating the quality of OpenStreetMap has focused on England, and particularly on London. The main reason was the availability of comparative datasets: the Ordnance Survey research unit kindly provided me with the full Meridian 2 dataset, while the more detailed comparison, for which we used MasterMap, relied on the wonderful Digimap service – though because of the time that it takes to process MasterMap, we were limited in the size of the area that could be compared.
One of the open questions that remained was the accuracy of data collection in other parts of the world. Luckily, Ourania (Rania) Kounadi, who studied on our MSc in GIS at UCL, had access to detailed maps of Athens. She used a 1:10,000 map from the Hellenic Military Geographical Service (HMGS) and focused on an area of 25 square kilometres in the centre of the city. The roads were digitised from the HMGS map, and the Goodchild-Hunter procedure was then used to evaluate the positional accuracy.
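To give a feel for the idea behind the Goodchild-Hunter procedure, the sketch below approximates the percentage of a test polyline that falls within a fixed-width buffer around a reference polyline. This is a simplified, sampling-based illustration – the function names and the point-sampling shortcut are mine, not the exact implementation used in the study:

```python
import math

def _point_seg_dist(p, a, b):
    # Euclidean distance from point p to the line segment a-b
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    if seg2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def _sample(line, n):
    # n points evenly spaced along the polyline `line` [(x, y), ...]
    lens = [math.hypot(line[i + 1][0] - line[i][0],
                       line[i + 1][1] - line[i][1])
            for i in range(len(line) - 1)]
    total = sum(lens)
    pts = []
    for k in range(n):
        d = total * k / (n - 1)  # distance along the line for this sample
        for i, sl in enumerate(lens):
            if d <= sl or i == len(lens) - 1:
                t = min(d / sl, 1.0) if sl else 0.0
                ax, ay = line[i]
                bx, by = line[i + 1]
                pts.append((ax + t * (bx - ax), ay + t * (by - ay)))
                break
            d -= sl
    return pts

def buffer_overlap(test_line, ref_line, width, n=200):
    """Approximate % of test_line lying within `width` of ref_line -
    the buffer-overlap idea at the heart of the Goodchild-Hunter method."""
    pts = _sample(test_line, n)
    inside = sum(
        1 for p in pts
        if min(_point_seg_dist(p, ref_line[i], ref_line[i + 1])
               for i in range(len(ref_line) - 1)) <= width)
    return 100.0 * inside / len(pts)
```

In practice the digitised reference road would be `ref_line`, the OSM way would be `test_line`, and `width` would be a buffer of a few metres in projected coordinates; an overlap near 100% indicates good positional agreement.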
The results show that, for most of the roads in the evaluation area, the overlap between the OSM and HMGS datasets was between 69% and 100%, with an average very close to 90%. Her analysis also included attribute and completeness evaluation, showing that the quality is high in these aspects too.
So a pattern is starting to emerge showing that the quality of OSM data is indeed good in terms of positional accuracy. This is surprising at first glance – how come people who are not necessarily trained in geographical data collection and do not use rigorous quality assurance processes produce data that is as good as the authoritative data?
My explanation for this, as I’ve written in my paper about OSM quality, is that it ‘demonstrates the importance of the infrastructure, which is funded by the private and public sector and which allows the volunteers to do their work without significant personal investment. The GPS system and the receivers allow untrained users to automatically acquire their position accurately, and thus simplify the process of gathering geographical information. This is, in a way, the culmination of the process in which highly trained surveyors were replaced by technicians, with the introduction of high-accuracy GPS receivers in the construction and mapping industries over the last decade. The imagery also provides such an infrastructure function – the images were processed, rectified and georeferenced by experts and thus, an OSM volunteer who uses this imagery for digitising benefits from the good positional accuracy which is inherent in the image. So the issue here is not to compare the work of professionals and amateurs, but to understand that the amateurs are actually supported by the codified professionalised infrastructure and develop their skills through engagement with the project.’
Rania’s dissertation is available to download from here.
22 November, 2009
One of the best reads that I had over the summer was David MacKay’s Sustainable Energy – without the hot air. The book (which you can download for free from his website) is clear, easy to follow, and a very interesting analysis of the options open to the UK for providing energy sustainably and without reliance on fossil fuels.
David MacKay’s analysis covers the issue of energy both on the consumption and generation sides. It runs through a whole series of options by using lots of very intelligent and elegant ‘back of envelope’ calculations that show what the reasonable assumptions are for each source of energy and for its use.
What is especially fantastic about this book is the way in which a fairly complex environmental issue is made accessible through several means.
Firstly, the whole book is based on a single measure (kWh/day per person), which is explained clearly up front and then used throughout the book. This makes it easy to compare the different options.
Secondly, the book uses a clear visualisation of stack-bars to show how the different options of consumption and production add up.
Thirdly, the book is made up of two parts – an easy-to-access first part, free of the detailed scientific references and backing material that would make it difficult to follow, but with enough information to understand how each assertion is made. For interested readers, technical chapters towards the end of the book provide all the scientific details.
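To give a feel for the single unit in the first point, any steady power draw converts to kWh/day with trivial arithmetic – a 40 W device left on all day uses just under 1 kWh/day. A one-line sketch (the function is mine, not from the book):

```python
def kwh_per_day(power_watts, hours_per_day=24.0):
    # energy used per day, expressed in MacKay's kWh/day unit
    return power_watts * hours_per_day / 1000.0
```

Expressing everything from kettles to wind farms in this one unit is what makes the stacked comparisons in the book possible.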
Altogether, it is a masterpiece of environmental information communication, which is very rare, unfortunately.
14 November, 2009
As part of an update of the work that I published in August 2008, I re-ran the comparison between the OpenStreetMap and Ordnance Survey Meridian 2 datasets. I have now completed the evaluation for October 2009 and a re-evaluation of the data from March 2008, so I decided to publish some outputs here; a full report of this assessment will follow in a future post. The map below shows the completeness of OpenStreetMap across England for the two periods. Click on the map to enlarge.
The second set of maps shows the estimation of completeness when attributes are considered. For this purpose, the calculation takes into account only line objects that are comparable to those in Meridian 2, thus excluding features such as footpaths. The following types of roads were used: motorway, motorway_link, primary, primary_link, secondary, secondary_link, trunk, trunk_link, tertiary, tertiary_link, minor, unclassified and residential.
In addition, a test verified that the ‘name’ field is not empty. This indicates that a street name or road number is included in the object’s attributes, so it can be considered complete with basic attributes. To make the comparison fair, only objects that contain a road name or number in Meridian 2 were included.
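In code, the filter described above might look something like the following sketch – the record structure and function name are illustrative assumptions, not the actual processing used to produce the maps:

```python
# Road types considered comparable to Meridian 2 line objects
COMPARABLE_HIGHWAYS = {
    "motorway", "motorway_link", "primary", "primary_link",
    "secondary", "secondary_link", "trunk", "trunk_link",
    "tertiary", "tertiary_link", "minor", "unclassified", "residential",
}

def attribute_complete(way):
    """True only for comparable road types carrying a non-empty 'name'
    (a street name or road number). The dict layout of `way` is an
    illustrative assumption, not the study's actual data format."""
    tags = way.get("tags", {})
    if tags.get("highway") not in COMPARABLE_HIGHWAYS:
        return False
    return bool(tags.get("name", "").strip())
```

So a residential road tagged with a name counts towards attribute completeness, while a footpath, or a residential road with an empty name field, does not.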
The growth within just over a year and a half is very impressive – rising from 27% in March 2008 to 65% in October 2009. When attributes are considered, completeness has risen from 7% to 25%. Notice that the criteria that I set for this comparison are more stringent than those in the previous study, so the numbers – especially for attribute completeness – are lower than those published in August 2008.
7 November, 2009
One of the interesting questions that emerged from the work on the quality of OpenStreetMap (OSM) in particular, and Volunteered Geographical Information (VGI) in general, is the validity of the ‘Linus’ Law’ for this type of information.
The law came from Open Source software development and states that ‘Given enough eyeballs, all bugs are shallow’ (Raymond, 2001, p.19). For mapping, I suggest that this can be translated into the number of contributors that have worked on a given area. The rationale is that a lone contributor in an area might inadvertently introduce errors – for example, forgetting to survey a street or positioning a feature in the wrong location. If there are several contributors, they might notice and fix these inaccuracies or ‘bugs’, and therefore the more users, the fewer ‘bugs’.
In my original analysis, I looked only at the number of contributors per square kilometre as a proxy for accuracy, and provided a visualisation of the difference across England.
During the past year, Aamer Ather and Sofia Basiouka looked at this issue by comparing the positional accuracy of OSM in 125 sq km of London. Aamer carried out a detailed comparison of OSM and the Ordnance Survey MasterMap Integrated Transport Network (ITN) layer. Sofia took the results from his study and aggregated them by grid square, so that an overall value could be calculated for every cell. The value is the average of the overlap between OSM and OS objects, weighted by the length of the ITN object. The next step was to compare the results to the number of users in each grid square, as calculated from the nodes in the area.
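The length-weighted average for a grid square can be sketched as follows – the data structure here is an illustrative assumption, not the actual format used in the analysis:

```python
def cell_overlap(segments):
    """Length-weighted average overlap for one grid square.
    `segments` is a list of (itn_length, overlap_pct) pairs, one
    per ITN object falling in the cell."""
    total = sum(length for length, _ in segments)
    if total == 0:
        return 0.0  # no comparable objects in this cell
    return sum(length * pct for length, pct in segments) / total
```

Weighting by length means a long road with poor overlap pulls the cell's score down more than a short side street would, which is the intended behaviour.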
The results show that, above 5 users, there is no clear pattern of improved quality. The graph below provides the details – but the pattern is that the quality, while generally very high, does not depend on the number of users – so Linus’ Law does not apply to OSM (and probably not to VGI in general).
From looking at OSM data, my hypothesis is that, due to the participation inequality in OSM contribution (some users contribute a lot while others don’t contribute very much), the quality is actually linked to a specific user, and not to the number of users.
Yet, I will qualify this conclusion with the statement that further research is necessary. Firstly, the analysis was carried out in London, so it is necessary to check what happens in other parts of the country, where different users collected the data. Secondly, the analysis did not include the interesting range of 1 to 5 users, so it might be that quality improves rapidly from 1 to 5 users and then plateaus – or maybe the big change is from 1 to 3? Finally, the analysis focused on positional accuracy, and it is worth exploring the impact of the number of users on completeness.
This is a call for papers for a workshop on methods and research techniques that are suitable for geospatial technologies. The workshop is planned for the day before GISRUK 2010, and we are aware of the clashes with the AAG 2010 annual meeting, CHI 2010 and the Ergonomics Society Annual Conference. However, if you would like to contribute to the book that the Commission is developing but can’t attend the workshop, please send an abstract and let us know that you can’t attend.
In the near future I’ll publish information about another workshop in March 2010 about the usability and Human-Computer Interaction aspects of geographical information itself – see the report from the Ordnance Survey workshop earlier in 2009.
So here is the full call:
Workshop on Methods and Techniques of Use, User and Usability Research in Geo-information Processing and Dissemination
Tuesday 13 April 2010 at University College London
The Commission on Use and User Issues of the International Cartographic Association (ICA) is currently working on a new handbook specifically addressing the application of user research methods and techniques in the geodomain.
In order to share experiences and interesting case studies a workshop is organized by the Commission, in collaboration with UCL, on the day preceding GISRUK 2010.
CALL FOR PAPERS
While there is growing awareness within the research community of the need to develop usability engineering and use and user research methods that are suitable for geographical and spatial information and systems, to date there is a lack of organized and documented experience in this area.
We therefore invite researchers with recent experience with use, user and usability research in the broad geodomain (cartography, GIS, geovisualization, Location Based Services, geographical information, GeoWeb etc.) to present a paper specifically focusing on the research methods and techniques applied, with an aim to develop the body of knowledge for the domain.
To participate, please send an abstract of at most one A4 page containing:
- A description of the research method(s) and technique(s) applied
- A short description of the case in which they have been applied
- The overall research framework
- Contact details and affiliation of the author(s)
We are also encouraging PhD researchers to submit paper proposals and share experiences from their research. At the workshop there will be ample time for discussing the application of user research methods and techniques. Good papers may be the basis for contributions to the handbook that is planned for publication in 2011.
Abstracts should be submitted on or before 1 December 2009 to the Chairman of the Commission, Corné van Elzakker ( firstname.lastname@example.org ).
More information can be found on the website of the ICA Commission on Use and User Issues and the GISRUK 2010 website.