cial intelligence and cognitive science
over the ostentatious universality of
Beer and his collaborators never
came close to fulfilling their grand vision, though they did produce some
economic models of little practical
use and an “operations room” with
an obvious debt to the bridge of the
Starship Enterprise. As the revolution crumbled under a U.S. economic
blockade and a series of strikes, their
most practical contribution to its
defense was the national telex network, low tech even by the standards
of the 1970s, which proved useful for
centralized control of emergency responses. Beer himself was changed
by his experiences in Chile, devoting
himself to fixing the world rather than
making money. The urbane lover of
fine living gave up his Rolls-Royce to
spend much of his later career as a
mystic, living simply in a primitive rural cottage.
This would seem to offer rich materials for a farce, or perhaps a tragicomic opera like those featuring talk
show host Jerry Springer and Canadian Prime Minister Brian Mulroney.
To her credit, Medina avoids mockery
while doing justice to the gripping
weirdness of the story. She puts the
Chilean experience center stage, examining tensions between Beer’s vision and the agendas pursued by various hosts and collaborators. Medina’s
heart is open to the hopes for a better
world that drove Allende’s revolution
and the faith her characters put in
Beer’s approach, but is not shy about
speaking up when she catches them
exaggerating its actual accomplishments or making contradictory statements. One contribution of her work
is to remind us that computer technology has been in use outside the
U.S. and Western Europe for a long
time, and that its history in the developing world may follow a quite different path.
To me, the most fundamental lesson is that all technology is political
and most new approaches to computing are promoted with utopian fantasies that later come to seem embarrassing. We instantly recognize the
political nature and unhinged ambition of the Cybersyn project because
they are alien to our own experience
in wealthy countries during an economically liberal era. But, as I have
discussed elsewhere, a similarly impractical vision of gigantic, real-time
systems incorporating forecasting
models was a mainstream part of corporate America in the 1960s.¹ Even the
science fiction control room idea was
already established in the business
press.² Likewise, the banking industry first embraced the idea of the “cashless society” almost 50 years ago, but
it still retains a futuristic allure. You
may also remember all the predictions that the Internet would transform politics, revitalize democracy,
and solve the problems of U.S. education. Snake oil, utopian dreams, and
science fiction narratives have played
a much more important role in the
adoption of information technology
than we would usually like to admit.
5. The History of
Computing Is Maturing
This kind of prize plays an important
role in the development of a field. By
honoring excellence it helps to shape a
canon of exemplary work and to build
a consensus on topics and approaches
of central importance. So what can we
learn about the history of computing
by looking at the books together?
One thing that jumps out is just
how far the field has developed from
its earliest days in the 1970s. The history of computing used to focus on
the history of computers themselves.
While many scholars continue to
look closely at particular machines,
such as ENIAC, there has been an
unmistakable shift from hardware to
applications and from narrow technical histories to broad portrayals of
technologies in their social contexts.
These books, in particular, demonstrate the rewards of tackling big topics of fundamental importance such
as the rise of Silicon Valley or the
rise of computer use within scientific research.
There has also been a shift in the
kinds of people telling the stories.
Early activity on history of computing
was driven by computer scientists and
pioneers such as Herman Goldstine,
Brian Randell, Bernie Galler, Donald
Knuth, and Nick Metropolis. In contrast, all four prize-winning authors
discussed in this column have Ph.D.’s
in some variety of science and technology studies or history of technology. Two also hold degrees in computer science or electrical engineering.
Three hold faculty positions—one in a
department of science and technology
studies and two within information
schools. None are appointed primarily in history departments or history of science programs.
This hiring pattern reflects the
openness of other disciplines to historical scholarship in computing,
but also has a negative impact on
the development of the field as most
of the best scholars have limited opportunities to teach in their specialist
areas or to train doctoral students in
historical research. I recently learned
that Medina’s book is also the first
history of computing book to win the
annual Edelstein prize from the Society for the History of Technology,
which is indisputably a good sign for
the recognition of work on computing by other historical specialists.
I hope this column and other historical commitments by the ACM and
IEEE can maintain a similar connection between computer people and
historians. I like to think that Paul
Baran would have approved.
1. Haigh, T. Inventing information systems: The systems men and the computer, 1950–1968. Business History Review 75, 1 (2001), 15–61.
2. Widener, W.R. New management concepts: Working and profitable. Business Automation 15, 8 (1968), 28–34.
Thomas Haigh (firstname.lastname@example.org) is an associate professor of information studies at the University of Wisconsin, Milwaukee, and chair of the SIGCIS group for historians of computing. A guide to other outstanding historical work is at http://www.sigcis.org/resources.
Copyright held by author.