the ideals of science.4 Despite these
efforts, many university departments
lost their experimentalists and the science vision faded into the background.
In the 1980s, science visionaries
from many fields saw ways to employ
high-performance computers to solve
“grand challenge” problems in science.
They said computing is not only a tool
for science, but also a new method of
thought and discovery in science. (Aha!
Computational thinking!) They defined
computational science as a new branch
of science imbued with this idea. The
leaders of biology, epitomized by 1975
Nobel Laureate David Baltimore, went
further, saying biology had become
an information science and that DNA
translation is a natural information
process. Another biologist, Roseanne
Sension, attributed the efficiency of
photosynthesis to a quantum algorithm
embedded in the cellular structure of
plant leaves (Nature, April 2007). Biologists have thus been leaders in driving nails into the coffin of the argument that computing is not a “natural science.”
Many other scientists have reached
similar conclusions. They include physicists working with quantum computation and quantum cryptography, chemists working with materials, cognitive
scientists working with brain processes,
economists working with economic
systems, and social scientists working
with networks.9 All claimed to work with natural information processes. Stephen Wolfram went further, arguing that information processes underlie every natural process in the universe.13
Those two external factors—rise of
computational science and discovery of
natural information processes—have
spawned a science renaissance in computing. Experimental methods have
regained their stature because they are
the only way to understand very complex systems and to discover the limits
of heuristic problem-solving methods.
Here is an example of an advance in
algorithms obtained through an empirical approach. In May 2004, an international research group announced it had
computed an optimal tour of 24,978 cities in Sweden (see http://tsp.gatech.edu/
sweden). By iterating back and forth
among several heuristic methods, they
homed in on a provably optimal solution. Their computation took about one
year on a bank of 96 parallel Intel Xeon
2.8GHz processors. With classical tour-enumeration algorithms, which are of
order O(n!), the running time would be
well beyond the remaining age of the
universe. With experimental methods,
algorithm scientists quickly found optimal or near-optimal solutions.
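To make the scale of the brute-force claim concrete, and to give a flavor of the heuristic iteration involved, here is a minimal Python sketch. It is not the research group's actual method, which used far more sophisticated solvers; it simply pairs a back-of-the-envelope size check on n! with two classical tour heuristics (nearest-neighbor construction followed by 2-opt improvement) on a small, made-up random instance.

```python
import math
import random

# Back-of-the-envelope check on brute force: the number of distinct
# tours of n cities grows on the order of n!, so for n = 24,978 even
# counting the tours is hopeless. lgamma(n + 1) gives ln(n!).
digits = math.lgamma(24978 + 1) / math.log(10)
print(f"24,978! has roughly {digits:,.0f} decimal digits")

def tour_length(tour, dist):
    """Total length of the closed tour under distance matrix dist."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def nearest_neighbor(dist, start=0):
    """Greedy construction heuristic: repeatedly visit the closest
    unvisited city. Fast, but usually well above optimal."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[last][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(tour, dist):
    """Local-improvement heuristic: reverse a segment whenever doing
    so shortens the tour; repeat until no improving move remains."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                # Swap edges (a,b),(c,d) for (a,c),(b,d) if shorter.
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

# Tiny random instance to exercise the pipeline.
random.seed(1)
pts = [(random.random(), random.random()) for _ in range(50)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
tour = two_opt(nearest_neighbor(dist), dist)
print(f"50-city heuristic tour length: {tour_length(tour, dist):.3f}")
```

Even this toy pipeline shows the experimental pattern: construct a candidate quickly, iterate an improvement heuristic until it converges, and measure the result at each stage.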
New fields grounded in experimental methods have opened up—
network science, social network science, design science, data mining, and
Bayesian inference, to name a few. The widening recognition that information processes occur in nature has refuted the notion that computer science is not “natural” and has complemented Simon’s arguments that computing is a science of the artificial.
When Is a Field a Science?
This brief history suggests that computing began as science, morphed
into engineering for 30 years while it
developed technology, and then entered a science renaissance about 20
years ago. Although computing had
subfields that demonstrated the ideals
of science, computing as a whole has
only recently begun to embrace those
ideals. Some new subfields, such as network science, network social science, design science, and Web science, are still struggling to establish their credibility as sciences.
What are the criteria for credibility
as science? A few years ago I compiled a list that included all the traditional ideals of science:1,3
˲ Organized to understand, exploit,
and cope with a pervasive phenomenon.
˲ Encompasses natural and artificial processes of the phenomenon.
˲ Codified, structured body of knowledge.
˲ Commitment to experimental methods for discovery and validation.
˲ Reproducibility of results.
˲ Falsifiability of hypotheses and models.
˲ Ability to make reliable predictions, some of which are surprising.
Computing’s original focal phenomenon was information processes
generated by hardware and software.
As computing discovered more and
more natural information processes,
the focus broadened to include “natural computation.”9 We can now say “computing is the study of information processes, artificial and natural.”1
Computing is not alone in dealing
with both natural and artificial processes. Biologists, for example, study artifacts including computational models
of DNA translation, the design of organic
memories, and genetically modified
organisms (GMOs). All fields of science
constantly face questions about whether knowledge gained from their artifacts carries over to their natural processes. Computing people face similar
questions—for example, does studying
a software model of a brain yield useful
insights into brain processes? A great
deal of careful experimental work is
needed to answer such questions.
The question of “scienceness” of
computing has always been complicated because of the strong presence of
science, mathematics, and engineering
in the roots and practice of the field.8,11
The science perspective focuses on
increasing understanding through experimental methods. The engineering
perspective focuses on designing and
constructing ever-improved computing
systems. The mathematics perspective
focuses on what can be deduced from
accepted statements.
The term “theory” illustrates the
different interpretations that arise in
computing because of these three perspectives. In pure math, theory means
the set of valid deductions from a set
of axioms. In computing, theory more
often means the use of formalism to
advance understanding or design.
Effects on the Education System
Unfortunately, our education system for young people has not caught