One of the changes currently happening in the area of geographic information in the European Union is the roll-out of the Infrastructure for Spatial Information in the European Community (INSPIRE) directive. The text of the directive uses the need to share environmental information as a justification for the creation of a European spatial data infrastructure:
‘Community policy on the environment must aim at a high level of protection taking into account the diversity of situations in the various regions of the Community. Moreover, information, including spatial information, is needed for the formulation and implementation of this policy and other Community policies, which must integrate environmental protection requirements’
Interestingly, this blurring between geographic information and environmental information can be traced back to 1970, when, at a conference dedicated to environmental information systems, Roger Tomlinson (who is credited with coining the term Geographic Information System) noted that:
‘The essential difference between most data and those describing the environment of the surface of the earth is that the latter frequently have a location identifier as part of the data element … throughout the symposium the terms “geographical data” and “environmental data” were used synonymously as were the terms “geographic information system” and “environment information system”’. (Tomlinson, R. F. (Ed.) (1970) Environment Information Systems, Proceedings of the UNESCO/IGU 1st Symposium on Geographic Information Systems, Ottawa, Canada.)
So nothing new – and the confusion between what is environmental information and what is geographic information is bound to continue.
Trying to track down the source of a term is one of the more interesting academic tasks. For example, finding out when people started researching Human-Computer Interaction and GIS is a bit like following a thread. First of all, the term Human-Computer Interaction is sometimes presented as Computer-Human Interaction, especially in the early 1980s when it emerged – the ACM Special Interest Group still uses CHI and not HCI. Before that, the common term was Man-Machine Interaction, which came out of studies in the 1940s. The way to uncover this terminology chain is to find papers that mention both terms and follow them through. Quite quickly you develop an understanding of the chain…
Then there is the issue of GIS – after all, the term was coined only around the mid-1960s, so surely many people outside the small circle of researchers who became familiar with it used other terminology. So you need to look for other terms, such as geographic information (as well as geographical information), maps, etc.
Following this approach, I found a 1963 paper by Malcolm Pivar, Ed Fredkin and Henry Stommel, ‘Computer-Compiled Oceanographic Atlas: an Experiment in Man-Machine Interaction’. The paper is as interesting as its writers – Pivar and Fredkin were members of the Artificial Intelligence group at MIT, and Stommel a leading oceanographer. The data came from surveys that were part of the International Geophysical Year (1957/8) – and the paper shows that information overload is nothing new.
For me, the most interesting passage in the paper is:
‘[I]n preparing a printed atlas certain irrevocable choices of scale, of map projections, of contour interval, and of type of map (shall we plot temperature at standard depths, or on density surfaces, etc.?) must be made from the vast infinitude of all possible mappings. An atlas-like representation, generated by digital computer and displayed upon a cathode-ray screen, enables the oceanographer to modify these choices at will. Only a high-speed computer has the capacity and speed to follow the quickly shifting demands and questions of a human mind exploring a large field of numbers. The ideal computer-compiled oceanographic atlas will be immediately responsive to any demand of the user, and will provide the precise detailed information requested without any extraneous information. The user will be able to interrogate the display to evoke further information; it will help him track down errors and will offer alternative forms of presentation. Thus, the display on the screen is not a static one; instead, it embodies animation as varying presentations are scanned. In a very real sense, the user “converses” with the machine about the stored data.’ (Pivar et al., 1963, p. 396)
What an amazing vision for 1963 – it would take another 30 years or more before what they described became a reality!
The following presentation is a summary of the OSM quality assessment paper that I posted here in August. It was presented at the UCL Centre for Advanced Spatial Analysis (CASA) S4 event, held on 8th January 2009.
The presentation does not include any analysis beyond what is included in the paper, apart from a graph comparing the bias of coverage with the Index of Multiple Deprivation (Slide 37), which shows the analysis for urban areas only. In the slide, only areas whose size is within one standard deviation of the average are shown; by and large, this means that only urban areas are included.
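The selection criterion itself is simple, and can be sketched in a few lines. In this sketch the area names and sizes are invented for illustration (the actual analysis used the real geography of the OSM coverage study), and the exact threshold – within one standard deviation of the average size – is taken from the description above:

```python
# Illustrative sketch of the Slide 37 filter: keep only areas whose size
# is within one standard deviation of the average size. Large (rural)
# areas fall outside the threshold, so mostly urban areas remain.
# The names and sizes below are invented for illustration.
import statistics

area_sizes = {'A': 2.1, 'B': 3.4, 'C': 1.8, 'D': 45.0, 'E': 2.7}  # km², hypothetical

mean = statistics.mean(area_sizes.values())
sd = statistics.pstdev(area_sizes.values())

kept = {name: size for name, size in area_sizes.items()
        if abs(size - mean) <= sd}
print(sorted(kept))  # → ['A', 'B', 'C', 'E'] – the large area 'D' is filtered out
```

Because large rural areas inflate the upper tail of the size distribution, a one-standard-deviation cut-off removes them while leaving the compact urban areas in place.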
Just as 2008 ended, Marc Farr, Jess Wardlaw and Kate Jones were awarded the IJMR Collaborative Research Award from the Market Research Society. Jess is working with me on the Knowledge Transfer Partnership with Dr Foster Intelligence, while Kate is leading the GIS work on the Towards Successful Suburban Town Centres project. They joined forces to write the paper:
Farr, M., Wardlaw, J. and Jones, C. (2008) Tackling Health Inequalities using Geodemographics: A Social Marketing Approach. International Journal of Market Research, 50, 4, pp. 449-468.
As the title suggests, the paper describes Dr Foster’s social marketing work and how geodemographic data are used to target health interventions, comparing the methodology to traditional market research methods. The statement from the award committee is rather nice:
‘This new Award recognises genuine co-operation between the practitioner (agency, client, etc.) and academic communities. Tackling Health Inequalities using Geodemographics – A Social Marketing Approach is an excellent example of the innovative methods being applied to the challenges faced by the UK public sector. It demonstrates how social marketing is being adopted in targeting healthcare priorities, and the role played by Dr Foster Intelligence as a public-private partnership in providing information to help achieve this goal.’
So well done to Jess, Kate and Marc!
As part of the work on community mapping in Hackney Wick, we used the area for a project with the Development Planning Unit MSc students. Since we are using Manifold GIS in this project, we offered the students the use of the package for this exercise.
From an experienced system administrator’s perspective, installing the package and linking it to the licence server is a quick and easy task. For the students, however, it proved difficult – especially on Windows Vista, where special procedures must be followed to enable the administrator account before Manifold GIS can be installed. The process is rather intimidating for the average user, and the information architecture and links on the Manifold website are not clear enough to guide a novice, non-technical user through the installation. As a result, many didn’t manage to get the package working. After a brief explanation and being pointed in the right direction, the installation issues were resolved.
This is a very interesting, and often overlooked, aspect of usability. When looking at a GIS or a component of geotechnology, it is worth evaluating its usability for different audiences. With software, I would differentiate between ‘end-user’, ‘programmer’ and ‘system manager’ usability. For each of these archetypes it is possible to evaluate whether the package is easy to use in that role. For example, programmer usability can be evaluated by examining how long it takes a programmer to learn how to manipulate the system and perform a task with it. The new generation of APIs, such as those used by OpenStreetMap or Google Maps, score highly on programmer usability – it takes very little time to learn them and achieve something useful with the system.
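As a rough illustration of what ‘very little time to learn’ means in practice, consider how little code is needed to pull point data out of the kind of XML document these web APIs return. The snippet below is a self-contained sketch – the XML fragment is a hand-written stand-in for an OSM-style API response, not actual data:

```python
# A few lines are enough to extract node coordinates from an
# OSM-style XML document -- the sort of quick first result that
# makes these APIs score highly on programmer usability.
import xml.etree.ElementTree as ET

# Hand-written stand-in for an API response (not real OSM data).
SAMPLE = """<osm version="0.6">
  <node id="1" lat="51.5245" lon="-0.1340"/>
  <node id="2" lat="51.5230" lon="-0.1339"/>
</osm>"""

def node_coords(xml_text):
    """Return a (lat, lon) tuple for every <node> element."""
    root = ET.fromstring(xml_text)
    return [(float(n.get('lat')), float(n.get('lon')))
            for n in root.iter('node')]

print(node_coords(SAMPLE))  # two coordinate pairs
```

The parsing itself is not the point; what matters for programmer usability is how quickly a newcomer gets from zero to a working result like this.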
The installation of Manifold GIS, therefore, scores high on system manager usability but low on end-user usability – and, importantly, there are far more of the latter than the former. Some small changes to the website, with a clear installation guide, would improve the situation significantly, but the real solution is a change to the installation process that removes the need to switch to the administrator account…
In October 2007, Francis Harvey commissioned me to write a review article for Geography Compass on Neogeography. The paper was written in collaboration with Alex Singleton at UCL and Chris Parker from the Ordnance Survey.
The paper covers several issues. Firstly, it provides an overview of the developments in Web mapping from the early 1990s to today. Secondly, in a similar way to my Nestoria interview, it explains the reasons for the changes that enabled the explosion of geography on the Web in 2005: GPS availability, Web standards, the increased spread of broadband, and a new paradigm of programming APIs. These changes affected the usability of geographic technologies and started a new era in Web mapping. Thirdly, we describe several applications that demonstrate this new wave – the London Profiler, OS OpenSpace and OpenStreetMap. The description of OSM is somewhat truncated, so my IEEE Pervasive Computing paper provides a better discussion.
The abstract of the paper is:
‘The landscape of Internet mapping technologies has changed dramatically since 2005. New techniques are being used and new terms have been invented and entered the lexicon such as: mash-ups, crowdsourcing, neogeography and geostack. A whole range of websites and communities from the commercial Google Maps to the grassroots OpenStreetMap, and applications such as Platial, also have emerged. In their totality, these new applications represent a step change in the evolution of the area of Internet geographic applications (which some have termed the GeoWeb). The nature of this change warrants an explanation and an overview, as it has implications both for geographers and the public notion of Geography. This article provides a critical review of this newly emerging landscape, starting with an introduction to the concepts, technologies and structures that have emerged over the short period of intense innovation. It introduces the non-technical reader to them, suggests reasons for the neologism, explains the terminology, and provides a perspective on the current trends. Case studies are used to demonstrate this Web Mapping 2.0 era, and differentiate it from the previous generation of Internet mapping. Finally, the implications of these new techniques and the challenges they pose to geographic information science, geography and society at large are considered.’
The paper is accessible on the Geography Compass website, and if you don’t have access to the journal, but would like a copy, email me.
Nestoria is a property search engine covering the European market, based on Web 2.0 technologies such as mashups – in this case, a Google Maps mashup showing the locations of the properties. The company blog runs a monthly interview, and I had the pleasure of being the Nestoria interviewee this month.
The interview addresses several aspects of neogeography, including the reasons for its rise and the implications for professional GISers. I comment on results from my evaluation of OpenStreetMap data and the implications of crowdsourced geographic information for businesses such as Nestoria.
The interview can be accessed on the Nestoria blog.