articulate the relevance of historical
work to present-day debates without
falling victim to what historians call
“presentism” (misinterpreting historical events in the light of present-day
knowledge or perspectives).
The third CHM Prize winner, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming by Paul Edwards (MIT 2010, CHM Prize winner 2011), is an outstanding
example of the potential for historians
to contribute to broader public debates
and give non-specialists insight into
the work done by scientists and the
process by which computer simulation has transformed scientific practice. Edwards tackles one of the most
politically polarizing topics in U.S. science today: the connection of climate
models to the real world. Without
computers we could calculate average temperatures and plot trends, but
only computer models can separate
underlying climate trends from local
or random fluctuations, project their
future trend, or test explanations of
the physical processes at work against
the underlying data.
Our traditional idea of science, embraced by many scientists, is that scientists collect objective observations about the world and then formulate theories to explain them. Scholars in the field of science and technology studies, in which Edwards is trained, have instead stressed that nothing can be perceived except through one or another set of theories and assumptions. In climate science, as in much modern science, data points from the natural world become knowledge of a kind that can support or challenge a theory only after they are processed in computer models. These models are themselves based on theories. Thus, as Edwards succinctly puts it in an introduction aimed at general readers, “without models there are no data.”
This is not to say that Edwards is content, as some earlier radical scholars in science and technology studies were, simply to establish that the scientific knowledge he is examining was “socially constructed” and exit in triumph. We have arrived at an odd moment where this strategy, once associated with the academic left, is now a mainstay of the political right.
4. Computer Technologies Are Always Political
Eden Medina’s book Cybernetic Revolutionaries: Technology and Politics in
Allende’s Chile (MIT 2011, CHM Prize
winner 2012) takes a close look at the
period from 1971 to 1973 when the
British operations research specialist Stafford Beer was hired by Salvador Allende’s short-lived democratic
Marxist government in Chile to implement his newly developed vision of cybernetic control. The boldest version
of this Cybersyn project imagined
traditional political control replaced
entirely by a new system in which decisions were influenced to the greatest possible extent by the input of
ordinary citizens, industrial production was organized with the help of
constantly updated computer models
of the entire national economy, and
decisions were based on information
rather than bureaucratic self-interest.
Beer modeled his plans on an abstracted view of the human nervous
system in accordance with the central idea of cybernetics, which was
that artificial, natural, and living systems are all governed by conceptually
equivalent processes of feedback and
control. Cybernetics originated in the
1940s with close ties to early work on
computers and the support of many
of the U.S.’s brightest minds across
a range of disciplines. By the 1970s it
was still fairly prominent in popular
culture but was already sliding into
obscure eccentricity within science
as researchers favored more focused
disciplinary approaches such as artifi-