news
Science | DOI:10.1145/3057735  Esther Shein

Combating Cancer With Data
Supercomputers will sift massive amounts of data in search of therapies that work.

Researchers used scanning electron microscope images of nanometers-thick mouse brain slices to reconstruct cells into a neocortex structure (center), whose various cell types appear in different colors. Image courtesy of Argonne National Laboratory.
The National Cancer Institute (NCI) and the U.S. Department of Energy (DOE) are collaborating on three pilot projects that involve increasingly intensive high-performance computing at the exascale level: the push toward a billion billion calculations per second, also known as an exaFLOPS (a quintillion, or 10¹⁸, floating point operations per second), roughly 50 times faster than today's supercomputers. The goal is to take years of data and crunch it to come up with better, more effective cancer treatments.
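To put those exascale figures in concrete terms, here is a minimal back-of-the-envelope sketch; the ~20-petaFLOPS figure for today's leading machines is an assumption inferred from the article's "50 times faster" comparison, not a number stated in the article:

    # Rough arithmetic behind the exascale figures quoted above.
    EXA = 10**18                # one exaFLOPS: 10^18 floating point operations per second
    PETA = 10**15
    todays_machine = 20 * PETA  # assumed ~20 petaFLOPS for a current top supercomputer
    print(f"Exascale speedup: roughly {EXA / todays_machine:.0f}x")  # prints roughly 50x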
The DOE had been working on building computing infrastructure capable of handling big data and entered into discussions with the NCI, which houses massive amounts of patient data. The two organizations realized there were synergies between their efforts and that they should collaborate.

The time is right for this particular collaboration because of the application of advanced technologies like next-generation sequencing, says Warren Kibbe, director of the NCI Center for Biomedical Informatics and Information Technology.
In addition, data is becoming more readily available from vast repositories, and analytics and machine learning tools are making it possible to analyze the data and make better sense of it.

Says Kibbe, "There is ever-better instrumentation and data acquisition