Published in Science as Culture, 11, 2, 2002, 191-214. Figures not inserted in this version.
Operating Issue Networks on the Web
by Richard Rogers
Arguably, a small segment of organisations, governments and corporations has moved beyond using the Internet to operating it. One of the beginnings of net operation lay in the emergence of Internet intelligence and monitoring companies. In 1995 the first of these companies was viewed in light of its ready off-Web comparison, the news-clipping service. Gradually, however, the companies have become known as Internet surveillance firms, whose main services comprise actively monitoring what's being said on the net by their competitors (on competitor Web sites), by their critics (on rogue and other Web sites), and by Chinese whisperers (on newsgroups, mailing lists and listservs) (New York Times, 1999). To their clients the monitoring companies send indices, largely in the well-known styles of Web and newsgroup search engine returns. Depending on the severity of the word on the net, the surveillance companies also recommend some form of action to be taken, in order to quash the rumour or put a more permanent end to the mongering (see figure 1).
While much of the more serious recommended action appears to be of the legal variety, in the form of trademark infringement, unfair trade and/or libel lawsuits, the less severe falls into the realm of media tactics. Indeed, certain corporations under fire for their human rights and environmental records often engage in tactical net media (Garcia and Lovink, 1999). Be it the rather standard discursive greenwashing of products, the more subtle creation of anger-venting net graffiti spaces such as the Shell forum, or the establishment of hyperlinks to one's critics, such net manoeuvring becomes immediately legible and trackable on the Internet knowledge maps, as we come to later in introducing new Web debate mapping techniques, based on basic socio-epistemological analysis (Lubbers, 1999; Sassen, 1998; Rogers and Marres, 2000).
Especially in times of semiotic crisis (say, a spate of hearsay, followed by a drop in the company's share price), one may imagine intervening in the fray through an understanding of the net manoeuvrings of multiple actors on particular issues. With discourse and link mapping in operation, the organisational manoeuvrings would assume the form of rewording Web sites, and linking or delinking to one's ever-changing friends and foes, supporters and debunkers. In other words, organisations would operate the Internet in this new manner by reading their link and discursive positionings on knowledge maps, and nuancing their Web presence accordingly.
Insert Figure 1 about here
Figure 1: Recommended action against rogue Web sites by Cyveillance.com, March 1999. Source: cyveillance.com
THE INTERNET AS RUMOUR MILL
The Internet has been a medium of dubious repute (Dean, 1997). In the United States, where the leading monitoring firms are based, the idea of the Internet as rumour mill gained currency among leading (traditional) news organisations and beyond, with the help of a 60 Minutes segment, broadcast on CBS television in 1997 (see figure 2). The show followed the practices of both a notorious Internet rumour-mongerer (the conspirator with a Harvard Ph.D.) and a monitor (the seasoned net sleuth, from a leading trade publication); it provided a picture of the amount of hearsay on the net, the ease with which it is spread and the makings of a burgeoning industry dedicated to tracking it all down. The segment concluded with a demonstration of how unscrupulous net operators could send messages as if they were from the 60 Minutes journalist (Diane Sawyer) or even the American president. Anything goes in this realm of information renegades and rumour raconteurs, the programme concluded.
Insert Figure 2 about here
Figure 2. Opening and other clips from the Internet Rumor Mill, 60 Minutes, 1997. Source: ewatch.com
On and off the net, tales abounded concerning how usually reliable sources (e.g., the seasoned journalist) were taken in by word on the net, unintentionally spreading a rumour. In the chain of Chinese whispers the word is reposted to lists and friends (in a chain letter effect), arguably living a much longer life on the global net than it would on the local street, finding new believers in different settings, à la P.T. Barnum. The alleged downing of TWA 800 by a U.S. Navy surface-to-air missile is one rather well-known example, also reported in the 60 Minutes segment. Another is reported by a RAND analyst, in reference to the events on the ground and on the net during the Zapatista uprising in Chiapas, Mexico.
One short e-mail message posted in February 1995 remains particularly notorious. In it, a U.S. professor sounded a warning, reportedly telephoned to him by activists on the scene, that army troops were on the move, bombs were being dropped, and bodies were piling up in a hospital in a town near San Cristóbal.
Internet surveillance companies have their own lists of Internet crisis case histories, testifying to the idea that net rumour and innuendo is a serious matter and may cause significant damage to an organisation's image and brands. The examples provided by Cybercheck, one of the Internet intelligence monitoring firms with corporate clients, are indicative, and humorous. For example, the share price of Mrs. Fields took a fall after the net spread the rumour that the cookie company fed O.J. Simpson's acquittal party.
The ideas and tactics of the Internet surveillance companies are moving in other quarters. Mats Karlsson, World Bank vice-president for external affairs, recently called for a system "to handle concerted email campaigns or attacks", while "exploring ways to play a more active role on the web sites of bank critics" (Financial Times, 2001).
While surveillance companies currently publicise only corporate cases (also for the benefit of future corporate client recruitment), governments and non-governmental organisations also have been subject to rumour-mongering, rogueing and smearing. From concerned surfers Amnesty International, for example, learned of an alleged rogue site produced by a Tunisian group (www.amnesty-tunisia.org) with pro-governmental sympathies. The GINGO site (a descriptor for a kind of NGO site) put a different face on Tunisia's human rights record, challenging and confusing Amnesty's claims.
There are a series of techniques employed by the dubious information providers. Some of the planned confusion or critical tactics are a result of domain name fudging, and organisations are advised (by Network Solutions and other domain name providers) to reserve all common suffixes, .net, .com and .org, not to mention the new domain suffixes such as .info and .biz. Greenpeace, for example, owns Greenpeace.com, as a quick search of whois.net demonstrates. At Greenpeace.com a rather stern message points the surfer to Greenpeace.org.
From the real organisations' and the monitoring companies' points of view, the question revolves around the extent of the rogues' impostering. Are they posing as the real? Can they thus be silenced, through off-Web legal means? On the Web, however, the question reads: do the rogues have presence within the discourse? Are they present in the same spaces as the real sources? Are these spaces reputable, i.e., are these spaces, be they search engines or other indices, endeavouring to author reputability? If so, are the rogues appearing as reputable?
In sum, cybersurveillance companies, in an effort to aid organisations in image-washing, supply indices (like search engine returns) of the rumours being spread on newsgroups and listserv discussions, as well as on certain self-chosen rogue Web sites. The client organisation, taking advice from monitoring companies, proceeds by ignoring, engaging, threatening or bringing suit. An organisation concerns itself with the rumour merchants on the Web itself only insofar as their critique, parody and/or polemic are well-spread or easily locatable by surfers actively seeking the real or at least reliable sources, as promised by the search engine or other net reliability generators. Thus the main issue for organisations is the extent of the rumour merchants' position vis-à-vis the organisation's own position, in a space authorised by the reliability provider.
Insert Figure 3 about here
Figure 3: A rogue Shell Web site by ®™ark.com. Source: rtmark.com/shell.html.
GAINING SENSES OF NET PRESENCE: THE HIT AND LINK ECONOMIES
It was once thought that an organisation's image and presence on the net were a product of the design of its Web site, and corporate marketing departments, for example, have spent large sums on a stunning net presence, with an emphasis on information design and, later, Web-delivered services, such as on-line shopping with encrypted messaging for credit card and other secure information transfers. As the networked nature of the medium became more apparent, certain organisations would design their sites to keep the surfer within their frame. If the surfer wished to move on via a hyperlink, he or she would remain within organisation x's site, browsing through y's site in x's frame. In another example of such (omni-)presence design, organisations, particularly corporations, would provide no outward hyperlinks from their sites. The surfer, it was hoped, would remain within that organisation's frame of reference. With the advent of the net as rumour mill, it has become increasingly clear, though, that an organisation's net presence derives from far more than site design and service delivery, and the maintenance of one's frame around the rest of the Web.
One way to think through new notions of net presence is to understand two types of Internet economies, the hit economy and the link economy. For some years the Internet was run according to a hit economy, with advertisers and other product, service, entertainment and information providers wishing to appear on, and otherwise associate themselves with, the most-hit sites, relative to the type of content. Indeed, organisations (e.g., content providers) make agreements with portals (e.g., entry providers such as America On-Line and Microsoft Network) to place their icons (with hyperlinks) on the opening page, thereby establishing themselves as one of the favoured points of entry on the Internet. Because of deal-making between portals and the makers of Internet browsing software, portals rank towards the top of hit lists, as do those sites which have paid for preferred placement on the opening portal pages. Recently, it was reported that there has been a winnowing of the quantity of sites visited on the Web overall, with MSNBC receiving by far the greatest share of the traffic for news portals (New York Times, 2001b).
In the hit economy organisations also are making agreements with search engine companies, as the surfacing of AltaVista's preferred placement service showed. As with advertising banners appearing on search engine return pages, the paid placement services boost particular sites when certain key words are entered. The services act above and beyond key word information design, HTML coding options (metatagging) and URL submission, all of which together normally had ensured that sites are located by engine crawlers and well-ranked by the engines. At AltaVista clever site design and management yielded to brute payment. Despite the controversy surrounding the AltaVista spat, preferred placement is still practised (New York Times, 2001a).
Insert Figure 4 about here.
Figure 4. Signs of the hit economy. Hit Table by Mediametrix, March 1999. Source: mediametrix.com.
Whether by portal or search engine placement, preferred sites are granted a larger audience (more hits). The organisational strategy thus revolves around establishing robust portal and search engine presence. In all, the combination of crawler-digestible coding, key word information design and favoured placement is an organisation's modus operandi in the hit economy. Robust net presence is subsequently demonstrated on hit tables, which drive Web advertising, the seminal form of e-commerce (see figure 4).
A second, less obvious aspect of net presence concerns the related link economy. The newer search engines place sites higher in their ranked returns if they are deemed authoritative sites, or sites with many inward links (Gibson et al., 1998). Google operates according to this principle. Sites with more links to them are thought to be more relevant on the net (or, put differently, deemed more relevant by the net). As indicated above, engineering links to one's site may involve making preferred placement arrangements with portals. It may involve subscribing to a link-generator organisation, such as linkexchange.com or linkpopularity.com, both of which guarantee greater levels of site traffic by providing more links to the site. For the vast majority of organisations, however, it's a matter of soliciting links, for instance by expressing interest in organising reciprocal linking. Links in may be gained through such hyperlink diplomacy; they also may be granted to a site independently, either at the whim of another page-maker or as an apparent matter of another's organisational policy.
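The link economy's basic measure, counting inward links, can be sketched in a few lines. The sites and links below are invented for illustration, not drawn from the research:

```python
# A minimal sketch of the link economy's basic measure: ranking sites
# by the number of inward links they receive. The link graph below is
# invented for illustration.
links_out = {
    "greenpeace.org": ["ipcc.ch", "unep.org"],
    "foe.org": ["ipcc.ch", "greenpeace.org"],
    "shell.com": [],  # no outward links: the "omnipresence" style
    "ipcc.ch": ["unep.org", "wmo.int"],
}

def inlink_counts(links_out):
    """Count links in per site; more links in means more authority."""
    counts = {site: 0 for site in links_out}
    for source, targets in links_out.items():
        for target in targets:
            counts[target] = counts.get(target, 0) + 1
    return counts

# Rank the sites by links in, most-linked first.
ranking = sorted(inlink_counts(links_out).items(), key=lambda kv: -kv[1])
```

In this miniature, ipcc.ch and unep.org, each acknowledged by two organisations, would rank above the sites with one link in or none.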
On the Web granting a link (as a reference in science) and receiving a link (as a citation in science) are akin to positioning oneself and being positioned by another, respectively (Wouters, 1999). Cognisance of such positionings may lead to consideration of presence strategy. To whom shall the organisation link? Does the organisation desire to have a link from another organisation? What sorts of politics of association may the linking structures around an organisation imply? What may be read between the links, and by whom? In order for such considerations to have greater purchase, the status of the link must be raised beyond Webometric and search engine logic environments, which rely mainly on counting inward links (Rogers, 1996).
From Link Economies to Web Epistemologies
Existing research on Web communities has found that sites frequently linked to have a greater level of authority (and surfer relevance) than those receiving fewer links in. The reasoning for the authority, borne out in initial experimentation that eventually led to engine logics such as Google's, relates to the idea that on the whole Webmasters and page-makers make the most links to the most authentic (and thereby reputable) sources for the broad topic being searched. In the example given in one particular piece of empirical research, the topic Harvard returned some 800,000 pages in standard search engines, with www.harvard.edu not appearing in the first sets of ten (Gibson et al., 1998). With a hyperlink-induced topic search, which takes into account links in and the pointer text, the college's main site appeared first, with commentaries on Harvard or other uses of the word dropping to lower rankings. Harvard.edu has the most links in from pages where the word Harvard appears. In the returns the main site was followed by www.hbs.harvard.edu, www.law.harvard.edu and ksgWeb.harvard.edu.
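The iterative hub-and-authority scoring behind such a hyperlink-induced topic search can be sketched as follows. This is a simplified rendering of the general technique, not the researchers' actual code; the miniature graph is invented, and pointer-text matching is omitted:

```python
# Sketch of hub/authority scoring in the spirit of Gibson et al. (1998):
# pages pointed to by good hubs gain authority, and pages pointing to
# good authorities become better hubs. Tiny invented graph; pointer-text
# matching is left out for brevity.
graph = {
    "commentary_a": ["harvard.edu", "hbs.harvard.edu"],
    "commentary_b": ["harvard.edu", "law.harvard.edu"],
    "commentary_c": ["harvard.edu"],
    "harvard.edu": [],
    "hbs.harvard.edu": [],
    "law.harvard.edu": [],
}

def authority_scores(graph, iterations=20):
    hubs = {p: 1.0 for p in graph}
    auths = {p: 1.0 for p in graph}
    for _ in range(iterations):
        # Authority score: sum of hub scores of pages linking in.
        auths = {p: sum(hubs[q] for q in graph if p in graph[q])
                 for p in graph}
        # Hub score: sum of authority scores of pages linked to.
        hubs = {p: sum(auths[t] for t in graph[p]) for p in graph}
        # Normalise so the scores stay bounded across iterations.
        total_a = sum(auths.values()) or 1.0
        total_h = sum(hubs.values()) or 1.0
        auths = {p: v / total_a for p, v in auths.items()}
        hubs = {p: v / total_h for p, v in hubs.items()}
    return auths

scores = authority_scores(graph)
top_site = max(scores, key=scores.get)
```

With three links in, harvard.edu emerges as the top authority in this toy graph, mirroring the kind of ranking reported in the empirical research.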
Insert Figure 5 top and bottom about here
Figure 5. Signs of the link economy. The Alexa tool bar, with individual site stats, including related links, and links in as an indication of reputability. Source: alexa.com.
Thus search engines operating with such link authority measures, as Google does, could induce movement away from a pure hit economy to a link-and-hit economy, at least for those surfers preferring search engines as their entry point to the Web. The sites with the most links in would show highest on the rankings. Presumably they then would be hit more frequently.
The demonstrable value of link-related logics for establishing topic source authority provides some way towards grappling with the rumour mill, as far as basic Web epistemology is concerned. Indeed, separating the wheat from the chaff has long been the key issue for both engines (ranking) and surfers (searching) desiring to find the real source. The chaos of the Internet (to use the vernacular) may be viewed as a product of the lack of source authority, in an information free-for-all brought about by 800,000 pages with something to say about Harvard, all being listed by engines returning sites with frequent Harvard key words and Harvard metatags (cf. Rogers, 1998). Engines with link authority logics, where www.harvard.edu rises to the top of the returns, author a new form of basic Web epistemology (Waltz, 1998). That is, they provide an indication of the status of information according to measurable reputability dynamics given by the Web.
In yielding greater real source reliability where an institution is queried, search engines using link authority logics thereby also would remedy rogueing, and, from the real source's perspective, the potential epistemological crisis resulting from surfers consulting the rogue site over the source site. In the case of epistemological competition (the imposter versus the real), link authority points to shell.com as opposed to the imitation site by an art group (see figure 3), and to amnesty.org as opposed to amnesty-tunisia.org. The rogues have less authority, for they have been granted fewer links in, with the link pointer text amnesty.
Whenever the query relates not to an institution but to an issue, epistemological competition becomes even greater. Taking an example from our research, well over one million pages have something to say about climate change. Since there is no such thing as the real climate change site, the challenge concerns locating and querying not the one real source but the discourse. Here the issue revolves around using organisational linking logics in order to reliably author the discourse, and be able to query it for positionings and positions in a debate. In a variation on the English parliamentary expression, where you stand depends on where you're positioned.
FROM WEB EPISTEMOLOGIES TO WEB DISCOURSES: GOVCOM.ORG
Before explaining the kind of (epistemological) thinking behind Web discourse, it is instructive to introduce a broad distinction about the Internet that arises from the two types of searches mentioned above, i.e., a search for an institution or for a subject matter. Seeking the one source (i.e., the real Harvard) may be said to carry with it the idea that the Web is made up of single sources. In this view, the Web becomes a series of brochures, catalogues, market stalls, or kiosks of individual organisations. Each kiosk authors and provides information about itself.
The Web, however, may be more readily conceived of as a hypertext, or, adding images, videos and sounds, a hypermedia environment, where conventional notions of both information provision and authorship are challenged. For our purposes the Web is thought of not as a series of information spaces authored by single sources but as discourse spaces to be authored by surfers (in the case of hypertext theory) or, in this case, to be authored by cartographic techniques and tools.
Just as link authority engines now author the reputable source (the real Harvard) and filter out the rogues, logics may be conceived to author a discourse. Moving from a source to a discourse perspective, Web pages are viewed as a series of inputs into discourses to be queried for standpoints within discourses across sites. Those sites (and viewpoints) not recognised as relevant by the discourse would be filtered as if hearsay.
Moving away from the basic epistemology of real source (fact) and imposter (fiction), the epistemological notion sought here concerns positioning presence. Are the organisation and the organisation's own points of view present in the Web discourse? Has the network making up the discourse granted that organisation and that point of view presence?
Govcom.org is a conceptual URL indicating three leading discursive source types on the Web and the three leading protagonist types in contemporary societal debates. (More a support act, .edu or .ac is thought of as being incorporated in the viewpoints and positionings of the .govs, .coms and .orgs.) The conceptual URL follows from projects to map the climate change and genetically modified food debates on the World Wide Web. Like the Harvard.edu researchers, here the project began with the observation that search engine returns (in the hit economy) provide little in the way of knowledgeable relevance and reputability ranking. When faced with a long list of sites containing the key word climate change or gm food, it is difficult to gauge the status and relevance of each organisation's knowledge and position vis-à-vis other organisations, apart from the fact that they are on the list. Which of the organisations are key players in the debate? Whom to trust? How can one tell?
When faced with such a list, the knowledge surfer is liable to begin with an organisation s/he already trusts (the reputable source, as Harvard.edu). In order to gain a picture of the issue, one is apt to follow the outward links from one trustworthy source to other sources deemed relevant (through the hyperlinks) by one's initial reputable site. If, however, one encounters a site without any outward links from its climate change page (as with a corporation without outward links, employing the style of net omnipresence), the surfer may backtrack and follow another route, or may very well lose track.
Here the experience of purposive mall shopping is analogous to Web knowledge surfing. After a period of on-the-ball browsing, one is likely to lose track of the original purpose. Lulled by the diversions of the net (as by those in the mall), most everything becomes of mild interest until that time when one regains sense of purpose.
Knowledge or discourse maps, plotted on the basis of trustworthy interorganisational hyperlinking patterns, provide a way out of the information daze. A potential knowledge surfer's course may be made visible by mapping the discourse and a number of routes through the reputable links the surfer may wish to follow (Marres and Rogers, 1999; Rogers and Zelman, 2001). So instead of mapping individual site structure to gain a sense of depth, as many artistic (WebStalker) and commercially available products (Inxight on AltaVista's Discovery) do, here a set of sites is mapped through hyperlinks to gain a sense of the breadth of a discourse. The links out of a group of initial reputable sources, involved in a debate such as climate change, are taken as the population, and those interlinking sufficiently within the population constitute the sample of the network. The connections (or links) between organisations are mapped, and routes are thereby found through the network.
The socio-epistemological logic behind the mappings and the routings has to do with reputation and knowledge networks. Interorganisational hyperlinkings provide a semblance of a knowledge and reputation network between organisations. Somewhat akin to a footnote in a manuscript, a hyperlink is thought of here as an acknowledgement by one organisation of another organisations relevance to the discourse, based on some appreciation for that latter organisations knowledge and reputation. A link indicates belonging. Depending on the size of the population (of organisations), multiple acknowledgements of relevance through hyperlinking determine the sample. Once hyperlinks are taken to be acknowledgements of relevance to a discourse, the network maps may be thought to reveal one or more discourses, within which debates are taking place. The spread of govcom.org (or .govs, .coms and .orgs) within the discourse is a sign of (transdiscursive) debate.
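The sampling logic just described can be sketched in miniature: the seeds' outward links form the population, and only sites acknowledged by a minimum number of organisations enter the network. The seed sites and the threshold of two are illustrative assumptions, not the project's actual parameters:

```python
# Sketch of demarcating an issue network from seed organisations'
# outward links. A site belongs to the sample only if at least
# `threshold` organisations acknowledge it with a link.
# Seed sites and the threshold are illustrative assumptions.
seed_outlinks = {
    "greenpeace.org": ["ipcc.ch", "unfccc.int", "foe.org"],
    "foe.org": ["ipcc.ch", "greenpeace.org", "unfccc.int"],
    "edf.org": ["ipcc.ch", "wri.org"],
}

def issue_network(outlinks, threshold=2):
    """Return the sites receiving links from >= threshold organisations."""
    received = {}
    for source, targets in outlinks.items():
        for target in targets:
            received.setdefault(target, set()).add(source)
    return {site for site, sources in received.items()
            if len(sources) >= threshold}

network = issue_network(seed_outlinks)
```

Here ipcc.ch (three acknowledgements) and unfccc.int (two) would enter the network, while singly linked sites fall outside the discourse.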
Insert Figure 6 about here
Figure 6: Organisational Linking Patterns (and potential routes) in the Climate Change discourse on the Web, November 1998. See appendix for legend. Map by Noortje Marres and Alex Bruce Wilkie.
Thus in govcom.org the Web is conceived of as a series of discourse or debate spaces awaiting mapping and routing instructions. Organisations' linkings provide the potential routes to be taken; routes through the discourse may be conceived of as storylines. Both the authoring and the authorising entity are the network, not the surfer (cf. Landow, 1997).
Where an organisation is positioned in such a discourse (and on a discourse map) thus becomes a matter of organisational concern. For example, is it central, peripheral or absent? From the discourse's perspective, is it relevant? Does an organisation's point of view matter if it is absent from the discourse map?
In sum, unlike the hit, and hit-and-link, economy search engines, which respectively find pages on the basis of each site's tagging and key word placements, or on the basis of quantitative reputability measures, govcom.org rethinks the search engine as a debate landscaping device, providing discursive authority logics (who's in, who's out), interorganisational routing instructions (see figure 6) and indications of levels of transdiscursive debate. Starting from a cluster of trusted sites dealing with an issue, the technique maps the linking logics of organisations to demarcate the discourse and to determine potential stories within it.
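One simple way to render a routing instruction is as a shortest hyperlink path between two organisations in the demarcated network, found by breadth-first search. The link map below is invented for illustration:

```python
# Sketch of an interorganisational "routing instruction": the shortest
# hyperlink path from one organisation to another, found by
# breadth-first search over an invented link map.
from collections import deque

issue_links = {
    "greenpeace.org": ["foe.org", "unfccc.int"],
    "foe.org": ["ipcc.ch"],
    "unfccc.int": ["ipcc.ch", "wmo.int"],
    "ipcc.ch": ["wmo.int"],
    "wmo.int": [],
}

def route(links, start, goal):
    """Return one shortest path of hyperlinks from start to goal."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route: the goal lies outside this discourse

storyline = route(issue_links, "greenpeace.org", "wmo.int")
```

Such a path, read site by site, is one candidate storyline through the discourse; the absence of a path signals that an organisation is unreachable from a given starting point.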
CONCLUSION: FROM PRESENCE TO BUBBLE OPERATION
A history of web presence strategising, as I have outlined above, begins with the recognition by firms and others of the impacts the Internet rumour mill may have on share prices and brand recognition. In the mid to late 1990s, Internet surveillance companies stepped in, offering services to chart word on the net, and recommend tactical action to be taken against imposters and critics, influencing reputation.
With the transition from the hit economy to the link economy in the late 1990s, it was recognised that an organisation's web reputation no longer derives from its site design. Rather, it is a product of the organisation's showing in reputable spaces.
Indeed, attempts are often made to buy presence in spaces where source reputability is authored. These spaces include directories, portals and search engine returns. For example, Shell's viewpoints on its own sensitive issues have appeared as advertisement banners on AltaVista whenever "Greenpeace", "sustainable development" and other environmentalist terms are entered into the engine or returned in the first sets of ten (see figure 7).
Insert Figure 7 about here
Figure 7. A portion of the Shell splash appearing in AltaVista when "Greenpeace" is queried or returned in the first sets of ten, August 1999. Source: via altavista.com
In the research described above, new means have been developed to author reputable space, the issue network, and to understand presence and debate in it. The research takes advantage of insights and techniques from such reputation-makers in the link economy as Google, and moves beyond them. If hyperlinks are understood as source recommendations per issue area, techniques may be developed to find and chart issue networks. These maps show the relevant sources per issue, and also what each is saying about the issue. The organisations outside the issue network are filtered out.
From the point of view of socio-epistemological discursive mapping, organisational net presence derives from overall positioning, that is, with whom an organisation is linked (inwards and outwards) and thus associated (figure 6), as well as where it stands on the issue once queried for a substantive position (see figures 8a and 8b).
Insert Figure 8a about here
Figure 8a: Discursive Issue Network Map of Developing Countries and Climate Change, November 1998. See appendix for legend. Map by Noortje Marres and Alex Bruce Wilkie.
Insert Figure 8b about here
Figure 8b: Discursive Issue Network Map of Uncertainty and Climate Change, November 1998. See appendix for legend. Map by Noortje Marres and Alex Bruce Wilkie.
Where climate change is concerned, a topic queried from the link maps could be developing countries, and the mapped organisations' stances on this subissue may be depicted on discursive maps. One notes that in the developing countries map, corporations provide little input to the story on developing countries and climate change. Only the corporate lobby group (the Global Climate Coalition, or GCC) has chosen to take a stand on developing countries. In the uncertainty and climate change discourse map, the NGOs provide scant input, preferring not to engage in the debate about the uncertain science of climate change.
Discursive maps are thought of as generally useful tools for thought, providing one understanding of particular organisational relationships, that is, interorganisational relevance and reputability acknowledgements on the basis of hyperlinks. The relevance logics demarcate discursive networks, and govcom.org queries show whether there are great debates ongoing, in the making or entirely absent.
Govcom.org also asks, does the discourse exhibit transdiscursivity? Are organisations acknowledging and deeming relevant players outside their immediate domain of understanding? Where on the Web do we find the elusive neo-pluralist potential, where there is not only cross-domain acknowledgement but transdiscursive viewpoint engagement? In the climate change debate, we do have multiple domains present, and we have found the makings of debate around the principal knowledge claim of the Intergovernmental Panel on Climate Change, "the balance of evidence suggests a discernible human influence on global climate" (Rogers, 1998). But here our discourse maps reveal only modest open debate on the subissues.
As has been argued, the maps also provide potential knowledge surfer routes through complex discourses. The maps thereby potentially author storylines, as we noted with developing countries and uncertainty. In all, the maps author a social epistemology in the sense that they provide ways of coming to know about climate change using issue networks, not individual Web sites, as the source.
Once at least the kind of thinking behind the govcom.org map-making and map-reading becomes apparent to the organisations on the maps, they may well find themselves in need of newfangled Web operators. I have dubbed these newfangled analysts bubble operators.
The bubble is a term borrowed from U.S. Navy command and control centre operation, as on a battleship; it also has been applied to the expert operation of other critical systems, such as nuclear power plants and air traffic control. It similarly may apply to news-gathering or to computerised stock trading environments. Two quotations provide an introduction to the notion of the bubble, or more specifically, having the bubble.
The bubble is not a metaphor for the cognitive map or representation; rather, having the bubble expresses the state of being in control of one.
To declare publicly that you have lost the bubble is an action that is deemed praiseworthy in the Navy because of the potential cost of trying to fake it. Depending upon the situation, either one or more of the other people in the center will feel they have enough of the bubble to step in, or everyone will scramble to try to hold the image together by piecemeal contributions. As a rule, even a few minutes of relief will allow the original officer to reconstruct the bubble and continue (Rochlin, 1997, 240).
The term having the bubble conveys the cognitive management of numerous information streams and instrument readings in situ. One maintains the image of the complex information landscape, and continually translates ones understanding of the situation into communication outputs. The operator mentally maps the information and dynamically interacts with a system increasingly of his own (cognitive) making.
Employing govcom.org logics and representing inter-organisational linking and discursive relationships on maps, one may track an organisations 'showing' on particular issues and thereby understand its net presence. Subsequently, operators - 'having the bubble' - would take real-time decisions to re-word the Web site, link or delink, in order to nuance that presence.
.org (Non-Governmental Organisations):
1. Gpeace Greenpeace
2. FoE Friends of the Earth
3. WWF World Wildlife Fund
4. ECO ECO Climate Action News Network
5. EDF Environmental Defense Fund
6. Shoe Shoeworld's On-line Petition lobbying Nike
.com (Corporations):
1. Shell Royal Dutch/Shell Company
2. Mobil Mobil Oil
3. BP British Petroleum
4. Texaco Texaco
5. Ford Ford Motor Company
6. Chevron Chevron
.gov (Governmental Organisations):
1. IPCC Intergovernmental Panel on Climate Change
2. UNFCCC United Nations Framework Convention on Climate Change
3. UNEP United Nations Environmental Programme
4. WMO World Meteorological Organisation
5. DoE Department of the Environment (United Kingdom)
6. HCuk The Hadley Centre
.org (Delegate Organisations):
1. IEA International Energy Agency
2. GCC Global Climate Coalition
3. GEIC Global Environment Information Center
About the author
Dr. Richard Rogers is Assistant Professor in New Media at the University of Amsterdam, and Visiting Professor in the Philosophy and Social Study of Science at the University of Vienna.