Democratisation in Web 2.0 and the participation inequality

Continuing to reflect on Keen’s The Cult of the Amateur, I can’t fail to notice how Web 2.0 influences our daily lives – from the way we implement projects, to the role of experts and non-experts in the generation of knowledge. Some of the promises of Web 2.0 are problematic – especially the claim of ‘democratisation’.

Although Keen doesn’t discuss this point, Jakob Nielsen’s analysis of ‘Participation Inequality on the Web’ is pertinent here. As Nielsen notes, on Wikipedia 0.003% of users contribute two thirds of the content, a further 0.2% contribute something, and 99.8% just use the information. Blogs are said to follow a 95-5-0.1 split (95% just read, 5% post infrequently, 0.1% post regularly). In blogs, this posting inequality is compounded by readership inequalities on the Web (power laws influence this domain, too – the top blogs are read by far more people than the rest).
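The concentration Nielsen describes is easy to reproduce with a toy power-law model. The sketch below assumes the k-th most active user contributes in proportion to k^-s; the exponent s = 1.2 and the user count are illustrative assumptions, not a fit to Wikipedia’s or Nielsen’s actual data.

```python
# Toy model of participation inequality: contributions per user drawn
# from a Zipf-like power law. The exponent s and user count are
# assumptions for illustration, not empirical values.

def top_share(n_users, s, top_frac):
    """Share of all contributions made by the top `top_frac` of users,
    assuming the k-th ranked user contributes proportionally to k**-s."""
    weights = [k ** -s for k in range(1, n_users + 1)]
    total = sum(weights)
    n_top = max(1, int(n_users * top_frac))
    return sum(weights[:n_top]) / total

# With 100,000 users and s = 1.2, the top 0.1% of users account for
# well over half of all contributions.
share = top_share(100_000, 1.2, 0.001)
print(f"Top 0.1% of users contribute {share:.0%} of the content")
```

Even this crude model shows how a tiny ‘elite’ of contributors can dominate the content, which is the nub of the argument against the ‘democratisation’ rhetoric.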

This aspect of access and influence means that the word ‘democratisation’ is largely a misnomer here. If anything, it is a strange laissez-faire democracy in which a few plutocrats rule – not a democracy of the type that I’d like to live in.

The Cult of the Amateur – worth reading

I have just finished reading Andrew Keen’s The Cult of the Amateur, which, together with Paulina Borsook’s Cyberselfish, provides quite a good antidote to the overexcitement of The Long Tail, Wikinomics and a whole range of publications about Web 2.0 that marvel at the ‘democratisation’ capacity of technology. Even if Keen’s and Borsook’s books are seen as dystopian (and in my opinion they are not), I think that through their popularity these critical analyses of current online culture are very valuable in encouraging reflection on how technology influences society.

The need for a critical reflection on technology and society stems from the fact that most of society seems to accept the ‘common-sense’ perspective that technology is a human activity which is neutral and ‘value-free’ (values here in the meaning of guiding principles in life) – that it can be used for good ends or bad ones, but by itself it does not encapsulate any values internally.

In contrast, I personally prefer Andrew Feenberg’s analysis in Questioning Technology and Transforming Technology where he suggests that a more complete attitude towards technology must accept that technology encapsulates certain values and that these values should be taken into account when we evaluate the impact of new technologies on our life.

In Feenberg’s terms, we should not separate means from ends, and should understand how certain cultural values influence technological projects and end up integrated in them. For example, Wikipedia’s decision to ‘level the playing field’ so that experts have no more authority in editing content than other contributors should be seen as an important value judgment, suggesting that expertise is not significant or that experts cannot be trusted. Such a point of view does have an impact on a tool that is widely used, and therefore influences society.

The Environment Agency’s Pollution Maps and how not to present environmental information

As part of the Mapping Change for Sustainable Communities project, we organised the first workshop in the Royal Docks area, at the Sunborn Yacht Hotel last Saturday (27/10). The workshop was very successful and, as I usually do in these workshops, I started with ‘what mapping information can we find on the WWW about your locality?’. I’ve been running this exercise for about 7 or 8 years, and over that period the Environment Agency’s Pollution Inventory maps have never failed me as an example of technocratic dissemination of information that does not help the people on the ground.

I find that, in all these workshops with people from many communities across London, very few knew about the Environment Agency information, let alone had ever accessed it independently. As the participants are usually from community or environmental interest groups, they are interested in the information – they just don’t know where to find it. I associate this lack of awareness with the fact that users find the information unfriendly and unhelpful, so there is no ‘word of mouth’ effect that leads to more use of the site. As someone in our Saturday workshop declared, ‘these maps are not written in community language or for community use’ – yet they tick all the boxes of the Aarhus Convention.

To understand what’s wrong, see the image below, which provides a view of the site on an average monitor (1024×768):
Environment Agency Pollution Inventory.
The header area is so big that all that is left is a fairly small area for the map.

Furthermore, as the full image of the page shows, the map is supposed to offer several layers of pollution data (listed on the right-hand side) but, because many of the layers include point data about the same site – all using the same symbol, overlapping one another – the user cannot tell whether a given location has information from multiple categories.
Environment Agency Pollution Inventory
The area of the map is very small (less than 400×400 pixels), and people find it very difficult to locate where they are, or where the postcode they have selected is, in relation to the information on the map. At the largest zoom scale, or at any stage during the process, the system will run a query and provide information about the specific location only if the ‘learn more’ option is selected. Even as a relatively frequent user, I often fail to click on this option, and find the interaction with the system frustrating.
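The overlapping-symbols problem has a simple remedy: group the records by location first, and draw a single symbol per site that reports every category present. A minimal sketch of that grouping step – the site coordinates and category names below are invented for illustration, not taken from the Pollution Inventory:

```python
# Sketch: collapse multiple point records at the same site into one
# symbol that lists all categories, instead of identical symbols
# stacked invisibly on top of each other. Data here is invented.

from collections import defaultdict

records = [
    {"site": (531_200, 181_400), "category": "Air releases"},
    {"site": (531_200, 181_400), "category": "Water releases"},
    {"site": (531_200, 181_400), "category": "Waste transfers"},
    {"site": (529_800, 180_900), "category": "Air releases"},
]

by_site = defaultdict(list)
for rec in records:
    by_site[rec["site"]].append(rec["category"])

# One labelled symbol per site, covering all categories present there:
for site, categories in sorted(by_site.items()):
    print(site, "->", ", ".join(categories))
```

The same idea could be delivered in the interface as a single clickable symbol whose pop-up lists every category recorded at that location.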

This site demonstrates that the Aarhus model of access to information, which is ‘we’ll build it and they’ll come’, is not sufficient and that a more user-centred approach is required to achieve public access to environmental information.

Getting the right projection – a helpful usability feature of Manifold GIS

UCL’s licence for Manifold GIS 8.0 finally arrived. While testing the new 64-bit version I was reminded of one of the interface features of Manifold that I believe many other GIS packages should have as standard – a request to verify the projection when a new component is added to the project.

One of the most confusing issues for new GIS users is handling projections within their workflow. Nowadays, it is common to integrate data from different sources, such as information gathered by GPS receivers with data from the Ordnance Survey, or any other data that uses a local projection. Therefore, it is important to ensure that the system ‘knows’ the projection of each layer and image.

Without proper configuration, putting all the data together simply doesn’t work: the projection of one layer doesn’t match that of another, and layers appear in the wrong place or not at all.
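A quick way to see why the system must ‘know’ each layer’s projection: the same spot on the ground has wildly different raw numbers in different coordinate systems. The coordinates below are illustrative (roughly central London), not surveyed values.

```python
# The same place expressed in two coordinate systems. A naive overlay
# that treats both as plain x/y separates the two copies of the "same"
# point by an absurd distance in coordinate space. Coordinates are
# illustrative, not surveyed values.

from math import hypot

# Layer A: WGS84 geographic coordinates (longitude, latitude in degrees)
point_wgs84 = (-0.1276, 51.5072)

# Layer B: the same place in British National Grid (easting, northing in metres)
point_bng = (530_000, 180_000)

naive_gap = hypot(point_bng[0] - point_wgs84[0],
                  point_bng[1] - point_wgs84[1])
print(f"Naive coordinate gap: {naive_gap:,.0f} units")
```

A GIS that records the projection of each layer reprojects one into the other’s coordinate system before drawing, so both land on the same spot – which is exactly why Manifold’s verification prompt is so useful.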

In Manifold GIS, when an imported image or vector layer is opened for the first time, the system asks the user to verify the projection, and opens the interface that allows the current projection to be assigned. Unfortunately, Manifold GIS does not offer the option to set the most common projection for the locale in which the system is used as the default – or at least to set a group of favourite projections. Room for improvement there!

As for Manifold GIS 64-bit – it seems to work faster, although it was a surprise to see that in some operations the 4 cores were not at 100%, or even 50%, utilisation, even while the system was loading the data slowly. Apart from that, Windows Vista 64-bit is incompatible with many legacy applications and is quite a pain to use. Maybe it’s time to return to Windows XP…

Manifold GIS - verify projection message.

Things that you learn at conferences…

During the International Cartographic Conference in Moscow last August, one of the presenters, while discussing GeoVisualisation, showed the Röyksopp (2002) ‘Remind Me’ clip. As it has been so long since I’ve followed MTV, or music videos on YouTube, it was the first time I had seen it…

The common comment on this brilliant videoclip is that it is about infographics. The designers of the video stated that

“as graphic designers we appreciate the way statistics can describe the whole world. It’s funny and frightening how the smallest aspect of the way we live can be translated into numbers. It also shows how predetermined our lives can seem from this point of view.”

However, there are some interesting Geographical aspects: notice how much of the information is spatial, and how scale plays an important role in the transitions between different visualisations. Other Geographical notions that this video prompts are Globalisation, Western Urbanisation, the culture and geography of consumption, and surely several more.

Interestingly, 3D representation is not so central and much of the information is provided through 2D representation.

Linking Environmental Information, GIS and Usability

One of the questions that might arise from a look at my publications and work is ‘how does public access to environmental information link to GIS, and what are the reasons to explore usability and HCI in this context?’

The answer is straightforward – there are strong links between all 3 areas and, in order to make sense of one of them, you need the others. In my 2001 paper ‘Public access to environmental information: past, present and future’ I go into more detail, but here is my current summary of the issue.

One of the core concepts of environmental decision making is the use of information. In current environmental debates you can see how much opponents dispute the accuracy and validity of information, but rarely dispute the need for information or the role of information in decision making. This approach to information in decision making can be traced back more than 40 years, and has been a constant feature of environmental politics.

The next element in the chain is Geographical Technologies – GIS, Remote Sensing, ground-based monitoring and the like. One of the features of environmental information is the heavy reliance on these technologies that, historically, the environmental field adopted early on. For example, consider the following (rephrased) paragraph:

‘Existing technology now makes possible the development of a global resource data base (GRID), which will be a data management service designed to convert environmental data into information usable by decision makers. The technical feasibility of GRID has been assessed by expert groups.

‘GRID technology allows us initially to describe, eventually to understand and ultimately to predict and manage the environment.’

This is not a description of Digital Earth or from a specification document of Google Earth – it is based on UN Environmental Programme documents from 1985 and 1986, when the GRID system was in its first stages. Even today we don’t have anything like the system that is outlined above. Was it a visionary view of the potential of GIS or yet another example of technophilia? In any case, it shows the strong link between GIS and environmental information.

The next element is usability and Human-Computer Interaction. GIS are hard to use (more about this in other posts) and the reliance on them to deliver information creates real obstacles for occasional users – which most users are. I have observed the difficulties of intelligent and competent people during workshops where they were faced with the task of using GIS or web mapping technologies. That’s where my interest in this area emerged.

In summary, the challenge of providing environmental information to the public is not just the technical one of making it available over the Internet – without understanding how to make GIS more accessible for the average user and understanding which are the most efficient, effective and enjoyable ways of helping people to use the information effectively, we can’t really deliver on the promises of the Aarhus convention on public access to information, participation and justice.