roscience can then more effectively
transition to novel computational architectures and more efficient use of
silicon’s capabilities.
Knowledge of the Brain Is Undergoing Its Own Dramatic Scaling
Several critical efforts are underway that make such a revolutionary perspective possible. Major government-funded efforts in neuroscience, such as the BRAIN Initiative in the U.S., the Human Brain Project in the European Union, and the China Brain Project, are focused on studying the brain at a systems level, which they argue will maximize the computational understanding of neural circuits. Several major non-profit efforts, most notably the Allen Institute for Brain Science and the Howard Hughes Medical Institute Janelia Research Campus, have similarly developed large programs to systematically study neural circuits. As a specific example, the BRAIN Initiative has a guiding goal of recording a million neurons simultaneously in awake, behaving animals.4 Such a goal would have been unfathomable only a few years ago; today, however, it increasingly appears within neuroscientists' grasp. Somewhat ironically, the advances in neuroscience sensors that allow neuroscientists to measure the activity of thousands of neurons at a time have been fueled in large part by the miniaturization of devices described by Moore's Law. It has been noted that the number of neurons recorded within a single experiment has itself undergone exponential scaling over recent decades.36
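The exponential trend in recording capacity can be made concrete with a short sketch. This is illustrative only: the roughly 7.4-year doubling time is the figure reported in the analysis cited above, while the 2011 baseline of 500 simultaneously recorded neurons is an assumed round number, not a measured value.

```python
def projected_neurons(year, base_year=2011, base_count=500, doubling_years=7.4):
    """Project simultaneous-recording capacity assuming steady exponential growth.

    All parameter defaults are illustrative assumptions for this sketch.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# One doubling time after the baseline year, projected capacity is
# approximately twice the baseline count.
```

Under this trend, capacity grows by a fixed multiplicative factor per year, which is why the curve appears as a straight line on a log-scaled plot of neurons recorded versus time.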
Similarly, large-scale efforts seeking to reconstruct the "connectome" of the brain are becoming more common.19 In contrast to ANNs, neural circuits are highly complex and vary considerably across brain regions and across organisms. This connectome effectively represents the graph on which biological neural computation occurs, and many neuroscientists argue that knowing this connectivity is critical for understanding the wide range of neural computations performed by the brain. While the technology to image these large-scale connectomes is increasingly available, there is a growing appreciation that challenges surrounding data analysis and storage, rather than simply achieving higher-resolution sensor technologies, are likely to become the limiting factor of neurotechnology.5,12
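The "graph" framing above can be sketched in a few lines. This is a hypothetical toy structure, not a description of any real connectome dataset: neurons become nodes, synapses become directed edges, and edge weights stand in for synaptic strength. Real reconstructions involve billions of edges and require sparse storage formats, for exactly the data-volume reasons just described.

```python
from collections import defaultdict

class Connectome:
    """Toy connectome as a weighted, directed graph (adjacency-dict form)."""

    def __init__(self):
        # pre-synaptic neuron -> {post-synaptic neuron: weight}
        self.synapses = defaultdict(dict)

    def add_synapse(self, pre, post, weight):
        self.synapses[pre][post] = weight

    def out_degree(self, neuron):
        """Number of outgoing synapses from `neuron`."""
        return len(self.synapses[neuron])

# Hypothetical example neurons and weights, for illustration only.
c = Connectome()
c.add_synapse("n1", "n2", 0.8)
c.add_synapse("n1", "n3", -0.4)  # negative weight stands in for inhibition
```

An adjacency dictionary is chosen here because biological connectivity is sparse; a dense matrix over millions of neurons would be dominated by zeros.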
This rise of large-scale neuroscience efforts focused on high-throughput characterization of the brain rests on many decades of substantial progress in understanding biological neural circuits, but it is notable that neuroscience's influence on computation has been relatively minor. While neural networks and related methods have been experiencing a renaissance in recent years, the advances that led to deep learning did not derive from novel insights about neurobiological processing, but rather from a few key algorithmic advances and the availability of large volumes of training data and high-performance computing platforms such as GPUs.23
While advances in neuroscience are not responsible for the recent successes in machine learning, there are reasons to believe it will be more important to look to the brain going forward. For example, the brain may offer novel computational mechanisms that enable the machine learning field to impact domains that still require human intervention, in the same sense that Moore's Law benefited from disruptive shifts in materials science. Two such areas are the requirements of current deep learning techniques for high-quality training data and the capabilities targeted by machine learning applications. Of most immediate concern is the data requirement of machine learning methods. While large volumes of data are increasingly common in many applications, obtaining high-quality data, defined by both well-calibrated sensors and effective annotations, is often incredibly expensive and time-consuming. As a result, the ability of deep learning-related methods to impact domains lacking appropriately structured data has been limited, even where the tasks involved are relatively straightforward for human operators.
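One widely used family of tools for getting more out of limited data is regularization. As a hedged sketch, the standard "inverted dropout" formulation (an assumed variant for illustration, not a specification from this article) zeroes each activation with probability p at training time and rescales the survivors so the expected activation is unchanged; at test time the layer does nothing.

```python
import random

def dropout(activations, p=0.5, training=True, rng=random):
    """Inverted dropout on a list of activations.

    Each value is zeroed with probability p during training; survivors are
    scaled by 1/(1-p) so the expected output matches the input. At test
    time (training=False) the input passes through unchanged.
    """
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() >= p else 0.0 for a in activations]
```

Randomly silencing units in this way discourages co-adaptation between units, which is why it acts as a regularizer on small datasets.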
More efficient use of data is an area of intensive machine learning research today, and has seen some recent improvements with regularization techniques such as "dropout"35 and generative adversarial networks, or GANs,