Viewpoint
Peter A. Freeman
Back to Experimentation
Three forward-looking projects depend on experimentation under real-world conditions.
Some of us in the computing field have been around long enough to start to see the adage “History repeats itself” come true in the way we produce major advances in computing. Here, I want to note the recent emergence of serious experimentation on a scale not seen in years¹ and relate it to what some of us participated in when the field was young.
Since the late 1990s, a number of proposals and initial attempts have sought to develop and experiment with new technology under real-world, but nonproduction, conditions, on a scale, relative² to the desired result, not seen since CACM was young. Three such efforts point the way toward a renewed and very valuable trend of experimentation.
The most visible is the Defense Advanced Research Projects Agency-supported effort to build and operate robotic vehicles capable of driving themselves under demanding, real-world conditions. It has succeeded, not only operationally, but also in engaging the efforts and imaginations of hundreds, perhaps thousands, of researchers and students, as well as the general public (en.wikipedia.org/wiki/Darpa_grand_challenge). It is for roboticists to evaluate the technical results, but from my perspective it has been a great success in helping us all set our sights on what can be achieved through experimentation at scale.
The second, just starting to do some preliminary prototyping after extensive planning, is the Global Environment for Network Innovations Project (www.geni.net), begun in 2004 by the National Science Foundation’s Directorate for Computer & Information Science & Engineering. GENI intends to refocus networking research on new architectures and mechanisms for future networks, not just on developing patches for our current networks. The project’s Web site, which describes GENI and provides pointers to related information, is maintained by the GENI Project Office, operated by BBN Technologies under agreement with NSF. GENI will support such research with a large-scale experimental network that will be the largest experimental piece of “equipment” built solely for computer science research. It is not yet well known outside the computing research community, though such mainstream publications as The New York Times and The Economist have covered its progress. Meanwhile, it has already spurred networking and related research (including computer science theory and communications theory) and major responses from Europe (www.future-internet.eu/) and Japan (seen only in news reports at the time of this writing³).
The third effort, called by some “data-intensive supercomputing,” is still largely at the talking stage, though it appears to be gaining momentum (reports-archive.adm.cs.cmu.edu/anon/2007/abstracts/07-128.html). Based on the idea that the massive,
¹Despite serious experimentation in computing research (reflected in the special section “Experimental Computer Science,” November 2007), from my perspective as a professor, we have not insisted on enough experimentation.

²“Relative” is the operative idea here. Most experimentation so far has been only a fraction of what a “fieldable” product or system might be, thus leaving open the question of scalability. One might argue, only slightly gratuitously, that some large government projects have indeed been “experiments”; unfortunately, they are rarely intended to be experiments, nor is much learned from the attempt in many cases.

³The Japanese Minister of Technology was widely quoted last summer (www.newlaunches.com/archives_japan_working_to_replace_the_internet.php), though he left office soon thereafter; plans are still being prepared.