Computing Is History
Reflections on the past to inform the future.
With cloud, big data, supercomputing, and social media, it is clear that computing has an eye on the future. But these days the computing profession also has an unusual engagement with history. Three recent books articulating the core principles or essential nature of computing place the field firmly in history. Purdue University has just published an account of its pioneering effort in computer science. Babbage and Lovelace are in the news, with bicentennial celebrations in the works. Communications readers have been captivated by a specialist debate over the shape and emphasis of computing's proper history.a And concerning the ACM's role in these vital discussions, our organization is well situated, with an active History Committee and full visibility in the arenas that matter.

Perhaps computing's highly visible role in influencing the economy, reshaping national defense and security, and creating an all-embracing virtual reality has prompted some soul-searching. Clearly, computing has changed the world, but where has it come from? And where might it be taking us? The tantalizing question of whether computing is best considered a branch of the mathematical sciences, one of the engineering disciplines, or a science in its own right remains unresolved. History moves to center stage in Subrata Dasgupta's It Began with Babbage: The Genesis of Computer Science.
a Downloads exceed 114,000 for Thomas Haigh's Historical Reflections column "The Tears of Donald Knuth," Commun. ACM 58, 1 (Jan. 2015), 40–44, as of August 26, 2015.
Dasgupta began his personal engagement with history in conversation with Maurice Wilkes and David Wheeler. Babbage, Lovelace, Hollerith, Zuse, Aiken, Turing, and von Neumann, among others, loom large in his pages.
Two recent books further suggest that computing is historically grounded. Peter Denning and Craig Martell's Great Principles of Computing2 builds on Denning's 30-year quest to identify and codify "principles" as the essence of computing. The authors readily grant the origins of the Association for Computing Machinery, initially coupled to the study and analysis of computing machines. In their perspective on computing as science, they approvingly quote Edsger Dijkstra's quip "computer science is no more about computers than astronomy is about telescopes." Dijkstra and others in the founding generation closely connected to studies in logic, computability, and numerical analysis naturally saw computing as a mathematical or theoretical endeavor and resisted a focus on engineering questions and technological manifestations.

Similarly, Denning and Martell look beyond the 42 ACM-recognized computing domains, such as security, programming languages, graphics, or artificial intelligence, to discern common principles that guide or constrain "how we manipulate matter and energy to perform computations," their apt description of the field. For each of their six principles (communication, computation, coordination, recollection, evaluation, and design), historical cases and historical figures shape their exposition. Communication is Claude Shannon, Harry Nyquist, Richard Hamming. These are historical principles.

In Great Principles, the closer the authors get to cutting-edge science, the less their findings resemble the science-fair model of hypothesis, data collection, and analysis. They start from