A Vast Machine:
Computer Models, Climate Data, and the Politics of Global Warming

Paul N. Edwards

The MIT Press 2010
A book review by Danny Yee © 2011 https://dannyreviews.com/
A Vast Machine presents "a historical account of climate science as a global knowledge infrastructure", emphasizing models and data and the relationships between them.

Edwards approaches this from the perspective of science and technology studies, with treatment of philosophical and sociological issues and only incidental biographical material. His approach is roughly chronological, describing the historical development of climate science and the earlier disciplines that constituted it, with a focus on atmospheric circulation and global temperatures.

A Vast Machine begins with an abstract overview, which readers should not let deter them. One key idea is infrastructural inversion: turning developed theory back on data in order to reanalyse and better understand it. Another focus is model-data symbiosis. And obstacles in data collection, computation and processing are categorised as computational friction, data friction and metadata friction.

Setting the scene for the development of "global knowledge infrastructures", Edwards describes the creation of universal time standards and associated international organisations. He then traces the history of climate science before World War II, with its traditional separation into meteorology, theoretical atmospheric science, and climatology.

This finishes with Guy Stewart Callendar's 1938 revival of Thomas Chamberlin's "carbon dioxide theory". Callendar's effort was immense and his calculations meticulous, but he was unable to use the full range of data available due to "the great difficulty, cost, and slow speed of gathering large numbers of records in one place in a form suitable for massive calculation".

Vilhelm Bjerknes developed equations for large-scale weather dynamics early in the 20th century, but they proved too difficult to use. So the Bergen School promoted a more descriptive and pragmatic approach, which, "by focusing attention on prediction at the expense of theoretical understanding, probably delayed meteorology's unification by several decades".

I was amazed to learn how far numerical weather modeling predates electronic computers. Lewis Fry Richardson carried out a six-week hand calculation of a retrospective forecast for six hours of 20 May 1910, using a 200km grid over Europe with five vertical layers. The impracticality of this, given the limits of human computation, meant that numerical approaches were abandoned for decades.

The Second World War gave a boost to weather forecasting, and military influence continued into the Cold War, when the possibility of controlling the weather and climate attracted serious attention. Meteorology, and specifically Numerical Weather Prediction (NWP), played a key role in the early development of computers, with John von Neumann prominently involved; a landmark came in 1950, when retrospective forecasts were successfully run on ENIAC.

Edwards provides a brief introduction to general circulation models (GCMs) and how they work before recounting their early history. Norman Phillips was the first to run a computerized GCM, in 1955, on a machine with 1kB of memory and 2kB of drum storage, but others soon followed, and Edwards sketches the histories of four of the major US research groups. Gilbert Plass used early computer models of radiative transfer to study the effects of carbon dioxide, introducing "doubling the concentration" as a paradigmatic modeling experiment, and it was quickly realised that water vapour feedback was critical.
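
To make concrete what a "doubling" experiment quantifies, here is a back-of-the-envelope sketch of my own (not Edwards's, and far simpler than anything a GCM does): it combines the standard logarithmic approximation for CO2 radiative forcing with an assumed equilibrium sensitivity of 3 °C per doubling, ignoring ocean lag, feedback details and everything else a real model resolves.

    # Illustrative only: a zero-dimensional "doubling CO2" calculation using the
    # standard logarithmic forcing approximation; not a GCM, and not from the book.
    import math

    def co2_forcing(c, c0=280.0):
        """Approximate radiative forcing (W/m^2) for a CO2 concentration c (ppm)
        relative to a pre-industrial baseline c0, using dF = 5.35 * ln(c/c0)."""
        return 5.35 * math.log(c / c0)

    def equilibrium_warming(c, sensitivity_per_doubling=3.0, c0=280.0):
        """Scale the forcing by an assumed climate sensitivity in deg C per doubling."""
        f_2x = co2_forcing(2 * c0, c0)        # forcing for a doubling, ~3.7 W/m^2
        return sensitivity_per_doubling * co2_forcing(c, c0) / f_2x

    print(round(equilibrium_warming(560), 2))   # a doubling from 280 ppm: 3.0 C by construction
    print(round(equilibrium_warming(420), 2))   # roughly present-day CO2: ~1.75 C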

In 1951 the International Meteorological Organization became the World Meteorological Organization, a United Nations specialised agency. One early project was the International Geophysical Year of 1957-58, during which data was still distributed through micro-photographs of printed report forms. The WMO's early existence was intertwined with Cold War geopolitics, with military involvement particularly notable in expensive upper atmosphere science, in atomic fallout monitoring for test ban treaties, and in the use of weather and climate observations as cover for spying.

Commissioned in 1967 after years of planning, the World Weather Watch (WWW) became the central activity of the World Meteorological Organization. For political as well as technical reasons, it was implemented as a kind of internetwork, coupling autonomous systems (national weather services) which retained different internal protocols and standards. Its research arm, the Global Atmospheric Research Programme, carried out the First Global Weather Experiment in 1978-79.

Collecting global data is one problem; there is also the challenge of automating its processing and of what Edwards calls "making data global": generating the regular grid data needed for modeling. With observations poorly matched to the needs of GCMs, gaps have to be filled by interpolation or by using the previous forecast. There are also challenges in using low-quality data sources (where model analysis sometimes leads to instrumentation corrections).

"4-D data assimilation and analysis models form the core of the modern weather information infrastructure. They link thousands of heterogeneous, imperfect, incomplete data sources, render them commensurable, check them against one another, and interpolate in time and space to produce a uniform global data image. In my terms, they make data global."

NWP and "objective analysis" only slowly improved forecasting. There were problems using satellite imagery at first, with photographs and volume measurements hard to translate into the grid and point measurements of NWP. Models were built to apply theory, not test it — what Edwards calls "reproductionism" instead of reductionism.

Data is not neutral, but is collected for particular purposes.

"Over decades, climate scientists have accumulated a long litany of frustrations with the overwhelming focus of the global observing system on forecasting."

Weather observation systems, dominated by the immediate needs of forecasting, often threw away much of the original sensor data. One response has been infrastructural inversion: going back to old data and reinterpreting it, in a process of "continual self-interrogation". An example is determining what kind of temperature recording systems specific ships used, so that their logs can be adjusted individually. This is coupled with "reanalysis": rerunning weather models over long periods using all the data now available, including observations that arrived too late to be useful for forecasting.
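
To give a flavour of what per-ship adjustment involves, here is a minimal sketch; it is purely illustrative, and the method names and offsets are placeholders rather than the corrections actually used.

    # Illustrative only: applying metadata-dependent corrections to ship logs.
    # The offsets are made-up placeholders; real adjustments depend on bucket
    # type, era, deck height and more.

    # assumed correction (deg C) to add, keyed by the recording method
    # recovered from ship metadata
    ASSUMED_OFFSETS = {
        "canvas_bucket": +0.3,     # uninsulated buckets cool by evaporation
        "engine_intake": -0.1,     # intakes tend to read slightly warm
        "insulated_bucket": 0.0,
    }

    # hypothetical log entries: (ship, recording method, recorded SST in deg C)
    raw_log = [
        ("HMS Example", "canvas_bucket", 17.2),
        ("SS Hypothetical", "engine_intake", 18.0),
    ]

    for ship, method, sst in raw_log:
        adjusted = sst + ASSUMED_OFFSETS[method]
        print(ship, round(adjusted, 2))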

Perhaps the most technical chapter in A Vast Machine is on "Parametrics and the Limits of Knowledge". Sub-grid processes in GCMs are handled by simple models with ad hoc parameters, and these models need to be tuned, creating a trade-off: more parameters bring greater realism but risk over-parametrisation, while fewer invite "why aren't you incorporating cosmic radiation?" complaints. There are serious epistemological issues over "verification" and "validation" of models; performing multiple runs with parameter variation is one approach to getting a grip on the uncertainties. (For anyone interested in this topic, I recommend Keith Beven's Environmental Modelling: An Uncertain Future?.) It is, as Edwards puts it, models "all the way down": "all knowledge about climate change depends fundamentally on modeling".
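
As a sketch of what "multiple runs with parameter variation" means in practice, the toy below (my own, not any actual GCM experiment) perturbs a single uncertain feedback parameter in a zero-dimensional energy-balance model and reports the spread of warming it produces for a fixed forcing.

    # Illustrative only: a perturbed-parameter ensemble on a toy zero-dimensional
    # energy-balance model, not any GCM's actual parametrization scheme.
    import random

    def equilibrium_warming(forcing_wm2, feedback_wm2_per_k):
        """Equilibrium temperature response dT = dF / lambda for a given
        radiative forcing dF and net feedback parameter lambda."""
        return forcing_wm2 / feedback_wm2_per_k

    random.seed(0)
    FORCING_2XCO2 = 3.7                     # W/m^2, standard value for doubled CO2
    runs = []
    for _ in range(1000):
        # sample the uncertain feedback parameter from an assumed plausible range
        lam = random.uniform(0.8, 1.8)      # W/m^2 per K (made-up bounds)
        runs.append(equilibrium_warming(FORCING_2XCO2, lam))

    runs.sort()
    print(f"median warming: {runs[500]:.1f} C")
    print(f"5th to 95th percentile: {runs[50]:.1f} to {runs[950]:.1f} C")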

Returning to the politics of atmospheric and simulation studies, Edwards traces the history of concerns about carbon dioxide-driven global warming from the 1960s down to the present. Key events included the publication of The Limits to Growth, the 1979 "Charney" report (which estimated the climate sensitivity to a doubling of CO2 at 3 ± 1.5 °C), and the Villach conference in 1985. Edwards also touches here on nuclear winter and ozone layer concerns.

A final chapter "Signal and Noise" considers a range of topics: the IPCC process and its uniqueness, "bringing the controversy within the consensus"; the unproductive interlocking of the scientific review process with political "review", especially in the United States; the controversy over the Microwave Sounding Unit, whose satellite measurements are still dependent on models; problems linking different models, often with very different inputs and outputs or on very different scales, and the use of Earth System Models; and the role of "citizen science" projects and online debates.

A conclusion briefly considers the parallel between climate data and modeling, on the one hand, and econometrics and economic modeling, on the other.


In his introduction Edwards suggests which chapters people with different backgrounds will want to read. For those "most interested in climate policy or controversy studies, chapters 11, 14, and 15 may be the most rewarding" but, despite this and the "Politics of Global Warming" in the subtitle, A Vast Machine is unlikely to play much of a role in polemical exchanges. It may help those with genuine concerns about the role of models in climate science but, as a serious study of the history of science in its often ugly complexity, it will have little effect on the mindless nay-saying of most "sceptics" (unless perhaps they manage to abuse it through selective quote-mining).

For anyone wanting to understand climate data and climate models in their historical context, however, and how it is that we can have knowledge about climate change at all, A Vast Machine is supremely informative and insightful. It may lack the broad appeal of biographical popular science, but it is never hard to read, weaving historical detail and epistemological abstraction into a coherent narrative. It is also an important contribution to the philosophy and sociology of science, as a study of a discipline rather different to the experimental sciences that are still taken as archetypes.

January 2011

%T A Vast Machine
%S Computer Models, Climate Data, and the Politics of Global Warming
%A Edwards, Paul N.
%I The MIT Press
%D 2010
%O hardcover, notes, index
%G ISBN-13 9780262013925
%P 518pp