I was fortunate to enter computing in the era of site funding by the Defense Advanced Research Projects Agency (DARPA). "DARPA sites" from the 1960s through the mid-1990s had sustained investment of $10M/year (inflation adjusted). The
talent and vision combined with sustained funding at scale enabled the
undertaking of bold transformative
ideas, such as timesharing, entire
new operating systems, novel computing system architectures, new
models of networking, and a raft of
exciting artificial intelligence technologies—in robotics, self-driving
cars, computer vision, and more.
For example, I joined Arvind's Tagged-Token Dataflow computing project as an MIT graduate student. This single project, which involved a dozen graduate and undergraduate students plus staff and faculty, garnered ~$13.5M in support with a run rate exceeding $4M/year.
Why is large-scale funding important? It enables examination of
larger questions that cross problem
spaces, systems, abstractions, even
fields. Yes, we have open source software. Yes, we can build on large-scale
frameworks and software systems.
Yes, we can compose services and leverage the cloud. But it is important to recognize that leveraging such infrastructures often means that rethinking or reimagining them is beyond the scope of inquiry. I believe the need for
such large investments has never been
greater. We are seeing:
˲ Transformative change in architecture, operating systems, programming languages, and applications.
˲ New large-scale geographic and distributed structures—datacenters, edge, home, undersea, orbital systems, and soon outer space.
˲ New applications from computing, data, intelligence, and trusted algorithmic execution (for example, blockchain) melding with a variety of fields such as law and business in new ways, with accelerating benefit to society.
˲ Information technology in the
body politic, journalism, and social
fabric writ large—“fake news,” “social
network addiction and depression,”
and “societal manipulation.”
In the past decade, we have seen extraordinary new capabilities. GPU computing has enabled new levels of energy efficiency and performance, but only with radical software change (and, of course, architectural change). The re-emergence of neuromorphic computing (including deep learning) has delivered new artificial intelligence capabilities to a broad array of applications. And visionaries suggest many more eruptions are coming.
Numerous National Academy and committee reports have called for research "big bets" based on need and opportunity. And in late 2007, during my service on the NSF CISE Advisory Committee, then-CISE AD Jeannette Wing initiated the NSF CISE Expeditions program to create such larger-scale projects.2 A bright spot! But the investment is much smaller than in prior programs, funding only 1.5 projects per year (each ~$10M over five years).
Expeditions are terrific projects, but typically split over several sites, diluting infrastructure, capability, and perspective. For example, a 2018 Expedition Award, "EPiQC: Enabling Practical-Scale Quantum Computation," led by my colleague Fred Chong at the University of Chicago, includes researchers at several institutions.
Given industrial-scale R&D, does computing need large-scale government research investments? Yes! Academic research has fundamental advantages for society, including education and broad dissemination of ideas and technology; openness to a wide range of possibilities and directions (not just "our business position"); rigorous vetting and public examination from a scientific point of view (for example, bias in algorithms, security architectures, privacy and exploitation); and vetting and public examination from a broad societal perspective (for example, social media).
So what would I propose?
Perhaps a new program. Perhaps a doubling, and doubling again, of the Expeditions program. Projects of twice the
scale ($20M over five years) with mechanisms to ensure they are concentrated
at no more than two institutions. And
doubling the number of such efforts to
three new starts every year. Too expensive? Such a program would be less than
1/5 of 1% of the NIH’s annual budget,
and 1/100 of 1% of the U.S. Department
of Defense budget.
Perhaps if we want to increase global
economic growth beyond what economists call “structural limits,” we need
more disruptive computing advances.
Andrew A. Chien, EDITOR-IN-CHIEF
Andrew A. Chien is the William Eckhardt Distinguished
Service Professor in the Department of Computer Science
at the University of Chicago, Director of the CERES Center
for Unstoppable Computing, and a Senior Scientist at
Argonne National Laboratory.
1. Conte, T.M., DeBenedictis, E.P., Gargini, P.A., and Track,
E. Rebooting computing: The road ahead. Computer
50, 1 (2017), 20–29.
2. NSF Expeditions in Computing Program; www.nsf.gov/
Copyright held by author.
DOI: 10.1145/3192027