development approaches. The current key HPC project relies on architectural innovation, technology breakthroughs, and hardware-software coordination to address these challenges. Novel architectures will be explored to address the requirements of various applications. Engineering trade-offs will be necessary to balance power consumption, performance, programmability, and resilience. Technology breakthroughs will be pursued through comprehensive research efforts, with special attention to application software. The Chinese government is encouraging development of key technologies, including high-performance processors and accelerators, novel memory devices, and interconnect networks.
The effort toward self-controllable processor technologies includes Sunway's SW many-core processor, NUDT's FT-series CPUs and Matrix-series accelerators, and Sugon's AMD-licensed x86 processor. NUDT has developed its proprietary interconnect network, TH-Net, with high bandwidth and low latency, making the TH-2 system efficient and scalable. The Sunway system likewise includes its own self-designed large-scale network, enabling the TaihuLight system to run efficiently on 10 million cores. More technology breakthroughs are still needed to support successful development of exascale systems.
The key HPC project also targets applications in climate change, ocean simulation, combustion, electromagnetic-environment simulation, oil exploration, materials science, astrophysics, and life science. New computational models and algorithms will be designed, and the efficiency, scalability, and reliability of the applications will be evaluated for future exascale systems.
The pervasive use of HPC has promoted the development of large-scale parallel software. Chinese researchers are strengthening the development of system software and application software for domestic hardware systems, aiming to establish the country's own HPC ecosystem.
Emerging big data and AI applications have also gained the attention of Chinese HPC research programs. The National Natural Science Foundation of China, the counterpart of the U.S. National Science Foundation, has launched an initiative in big-data science to research computational models, algorithms, and platforms for data analytics and processing. Related projects focus on such big-data-related fields as video processing, health and medicine, intelligent transport, finance, government administration, and intelligent education. An upcoming national research initiative on AI will likewise call for HPC support for AI applications. The scope of HPC applications will certainly broaden in the future.
Parallel computers and parallel applications have cross-pollinated in China for the past 15 years. The availability of leading-class supercomputers has stimulated the growth of parallel applications in a number of fields, an application- and technology-driven growth trend that will continue into the future.
How to maintain sustainable development toward the next generation of supercomputing in China is an open question. Though significant progress has been made in recent years, China still lags Western countries in HPC in many respects. A long-term national plan for HPC is needed to allow more systematic deployment of HPC research. A mechanism that ensures sustainable development of the national HPC infrastructure must also be established, so the supercomputing centers do not have to struggle to find the money needed to run their supercomputers.
Exascale computing projects are being implemented in the U.S., Japan, and Europe, aiming to deliver exaflops computers in three to five years. Their effort is like mountain climbing: climbers can enjoy the magnificent scenery only when they reach the top after an arduous journey. The Chinese HPC community is willing to work with the international HPC community to pursue the goal of exascale computing, sharing experience and jointly attacking the grand challenges. HPC should not be a new kind of arms race but a technology that benefits all people.
Chinese researchers also need to be aware of new technologies and applications. The emergence of big data and AI brings new challenges and opportunities to HPC. Supporting big data and AI with HPC, while HPC is in turn rewarded by big-data- and AI-enabled technologies, should drive coordinated and converged development of all three. All should take this opportunity to embrace this exciting new era of supercomputing.
Yutong Lu is a professor of data and computer science at Sun Yat-Sen University and Director of the National Supercomputing Center in Guangzhou.
Depei Qian is a professor and Dean of Data and Computer Science at Sun Yat-Sen University, Guangzhou, and a professor of computer science and engineering at Beihang University, Beijing.
Haohuan Fu is a professor in the Department of Earth System Science at Tsinghua University, Beijing, and Deputy Director of the National Supercomputing Center in Wuxi.
Wenguang Chen is a professor in the Department of Computer Science and Technology at Tsinghua University, Beijing.
© 2018 ACM 0001-0782/18/11 $15.00
The TaihuLight supercomputer is installed at the National Supercomputer Center in Wuxi.