A Potential Timeline of Brain-Inspired Capabilities in Computation

Here, I describe one outlook for neural algorithm "scaling," wherein the community benefits from the development of progressively more advanced brain-like capabilities in algorithms. This work comes from the perspective that the rapid increase in available experimental data is a trend that is unlikely to end soon. While the BRAIN Initiative goal of simultaneously recording one million neurons may appear impressive, that number of neurons is only a tiny fraction of a rodent cortex, and the diversity of neural regions and complex behaviors suggests that a plethora of algorithms waits to be defined.

A major indicator of this progress will be potential developments in theoretical neuroscience. Neuroscience has long been a field where the ability to collect data has constrained the development of robust neural theories, and it is a growing hope that the ability to measure large populations of neurons simultaneously will inspire the development of more advanced neural theories that previously would have been dismissed as […] that help utilize poorly labeled data,14 but there are many reasons to believe that the brain's approach to maximizing the utility of observed data in both developmental and adult learning is a notable area where brain-inspiration can dramatically improve computing.

More broadly, it is useful to consider neuroscience's impact on computing capabilities. In general, machine learning has focused primarily on tasks most associated with sensory processing in the brain. Increased knowledge of the neural circuits of non-sensory regions, such as the hippocampus, prefrontal cortex, and striatum, may well provide opportunities for radically different approaches to algorithms that provide computational intelligence.
Figure 2. The continued scaling of neural computing need not rely on improved materials, but rather can be achieved by looking elsewhere within the brain. Today, we are exploiting advances in conventional ANNs at large scale, but there are already trends toward more temporally based neural networks, such as long short-term memory. We are poised to benefit from a series of these technological advances, bringing neural algorithms closer to the more sophisticated computational potential of the brain.
[Figure 2 chart: "Advancing Neural Capability" (y-axis) versus "Time" (x-axis), with successive algorithm classes: Deep Sensory Networks, Temporal Neural Networks, Bayesian Neural Algorithms, Dynamical Memory Algorithms, Cognitive Inference Algorithms, Self-organizing Algorithms.]
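The figure's step from deep sensory networks to temporal neural networks hinges on recurrence: hidden state that persists across time steps. The following is a minimal NumPy sketch of that idea, using a simple Elman-style update rather than the long short-term memory named in the caption; the weight values are arbitrary and chosen only for illustration.

```python
import numpy as np

# Minimal Elman-style recurrent step: the hidden state h carries context
# across time, so identical inputs can yield different hidden states
# depending on history. Weights are arbitrary illustrative values; an
# LSTM adds learned gating on top of this basic recurrence.

rng = np.random.default_rng(0)
W_in = rng.standard_normal((4, 2)) * 0.5   # input -> hidden
W_rec = rng.standard_normal((4, 4)) * 0.5  # hidden -> hidden (recurrence)

def run(sequence):
    h = np.zeros(4)
    outputs = []
    for x in sequence:
        h = np.tanh(W_in @ x + W_rec @ h)   # state update mixes input and history
        outputs.append(h.copy())
    return outputs

x = np.array([1.0, -1.0])
outs = run([x, x, x])
# The same input is presented three times, yet the hidden states differ
# from step to step because W_rec @ h injects the past into the present.
```

A purely feedforward network, by contrast, would map the repeated input to the same output every time; the recurrent term is what makes dynamic signals such as video and audio tractable.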
Table. For each algorithm class: current algorithms, biological inspiration, and application.

Algorithm Class: Deep Vision Processing
  Current Algorithms: Deep Convolutional Networks (VGG, AlexNet, GoogleNet), HMax, Neocognitron
  Inspiration: Hierarchy of sensory nuclei and early sensory cortices
  Application: Static feature extraction (e.g., images) and pattern classification

Algorithm Class: Temporal Neural Networks
  Current Algorithms: Deep Recurrent Networks (e.g., long short-term memory), Hopfield Networks
  Inspiration: Local recurrence of most biological neural circuits, especially higher sensory cortices
  Application: Dynamic feature extraction (e.g., videos, audio) and classification

Algorithm Class: Bayesian Neural Algorithms
  Current Algorithms: Predictive Coding, Hierarchical Temporal Memory, Recursive Cortical Networks
  Inspiration: Substantial reciprocal feedback between "higher" and "lower" sensory cortices
  Application: Inference across spatial and temporal scales

Algorithm Class: Dynamical Memory and Control Algorithms
  Current Algorithms: Liquid State Machines, Echo State Networks, Neural Engineering Framework
  Inspiration: Continual dynamics of hippocampus, cerebellum, and prefrontal and motor cortices
  Application: Online learning, content-addressable memory, and adaptive motor control

Algorithm Class: Cognitive Inference Algorithms
  Current Algorithms: Reinforcement learning (e.g., Deep Q-learning), Neural Turing Machines
  Inspiration: Integration of multiple modalities and memory into prefrontal cortex, which provides top-down influence on sensory processing
  Application: Context- and experience-dependent information processing and decision making

Algorithm Class: Self-organizing Algorithms
  Current Algorithms: Neurogenesis Deep Learning
  Inspiration: Initial development and continuous refinement of neural circuits to specific inputs and outputs
  Application: Automated neural algorithm development for unknown input and output transformations