recting these GPU trials. “We’re always
looking for cost-effective computing
that can run research weather models,” says Govett, who describes GPUs
as the next generation of supercomputing technology. Govett says that, to run
the new weather models, NOAA would
need traditional CPU-based systems
that would cost $75 to $150 million and
would require building special data facilities to accommodate the power and
cooling. “GPU systems with similar capabilities could be built for about $10
million, with no special building needed,” he says.
As an example of the kind of cost
savings involved with GPUs, Govett
cites a test run with a next-generation model called the Nonhydrostatic Icosahedral Model (NIM). The advanced computing lab at ESRL, which includes model developers, code-parallelization experts, and GPU researchers, calculates that more than 200,000 CPU cores would be needed to produce a NIM forecast close enough to real time to be useful for prediction. The ESRL
researchers, who began experimenting
with GPUs in 2008, demonstrated that
the NIM model could be run 25 times
more quickly on GPUs than on traditional CPUs. (For an overview of new
developments in high-end computing
with GPUs, see “Supercomputing’s
Exaflop Target” in the August 2011 issue of Communications.)
While GPUs appear to be a promising alternative to traditional supercomputing, several challenges could
prevent them from being adopted for
weather modeling. For one, the code
must be modified to run on GPUs.
Govett and his team have developed their own compilers to convert their Fortran code into CUDA, the language used on NVIDIA GPUs. As these compilers
mature, Govett explains, the parallelization process will get easier. But, at
least for now, Govett calls the work to
parallelize models to run efficiently
on GPUs a significant challenge. That
code-parallelization difficulty is magnified with the new class of ensemble
modeling that is expected to revolutionize weather prediction.
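To make the porting problem concrete, the sketch below shows the general shape of the transformation such a Fortran-to-CUDA translation performs: a loop over horizontal grid points becomes a CUDA kernel in which each GPU thread updates one vertical column. It is a minimal, hypothetical illustration, not NOAA's NIM code or the output of the team's compiler, and every name in it (advance_temperature, n_points, n_levels) is invented for this example.

#include <cuda_runtime.h>

// Hypothetical kernel: advance a temperature field one time step.
// Each thread owns one horizontal grid point and walks its vertical column.
__global__ void advance_temperature(const float *tendency, float *temp,
                                    int n_points, int n_levels, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n_points) return;                       // guard the ragged last block

    for (int k = 0; k < n_levels; ++k) {
        int idx = k * n_points + i;                  // level-major array layout
        temp[idx] += dt * tendency[idx];             // simple forward-in-time update
    }
}

// Host-side launch: enough 256-thread blocks to cover every grid point.
void launch_advance(const float *d_tendency, float *d_temp,
                    int n_points, int n_levels, float dt)
{
    int threads = 256;
    int blocks = (n_points + threads - 1) / threads;
    advance_temperature<<<blocks, threads>>>(d_tendency, d_temp,
                                             n_points, n_levels, dt);
}

An operational model contains hundreds of such loops, many with dependencies between neighboring columns, which is part of why Govett describes parallelizing the models to run efficiently on GPUs as a significant challenge.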
[Figure: A hexagonal grid used by several next-generation global weather models. This particular grid is based on a 480-kilometer model. The next generation of weather models, driven by GPU technology, will be run at a scale of 2 to 4 kilometers, making neighborhood-level accuracy possible.]

Single weather models, by themselves, present a significant challenge to high-performance systems capable of handling extreme workloads. Large ensembles consisting of many members (or models with different configurations) generate a range of forecasts that are then combined to produce a single, more accurate forecast. At the finest scale needed for accurate weather prediction, these ensembles can be run only on the fastest supercomputers. “We recently ran a 20-member ensemble on the Oak Ridge Jaguar supercomputer, which was the largest supercomputer in the world until last year,” says Govett. “That model required over 120,000 CPU cores, or basically half of the machine, and this was for one ensemble run.”
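The combination step itself is computationally simple; a minimal, hypothetical sketch of the most basic version, an unweighted per-grid-point mean across members, might look like the CUDA kernel below (ensemble_mean, n_members, and n_points are invented names for illustration, not an operational NOAA code path).

#include <cuda_runtime.h>

// Hypothetical kernel: average the forecasts of all ensemble members at each grid point.
__global__ void ensemble_mean(const float *members, float *mean,
                              int n_members, int n_points)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per grid point
    if (i >= n_points) return;

    float sum = 0.0f;
    for (int m = 0; m < n_members; ++m)
        sum += members[m * n_points + i];            // member-major layout: member m, point i
    mean[i] = sum / n_members;                       // unweighted mean at this point
}

Operational systems typically combine members with more sophisticated statistics than a plain mean, and the real cost lies elsewhere: every additional member is another full model run, which is why a single 20-member ensemble consumed roughly half of Jaguar.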
Govett says NOAA does not have the processing power to run large ensemble models at a research level, let alone at an operational level where the models are run quickly enough for the predictions to be useful for forecasting. “Research computing, and to some extent climate forecasting, does not have the time constraint that operational weather forecasting does,” says Govett. “In operations, weather models need to be run quickly or the information they produce will not be useful, particularly for severe weather where lives and property are at risk.”
As for the future of ensemble modeling, Govett says that, by exploiting the parallelism available in GPU and multicore systems and rewriting existing code with new algorithms and solvers, ensemble models will be able to run at a global scale and generate weather and climate predictions with much better accuracy than can be achieved today. Despite the ongoing challenges facing Govett and other researchers as they work to refine weather modeling and find alternatives to traditional supercomputer-class systems so that the more sophisticated ensemble models can move from research to operational use, Govett says he remains optimistic about the years ahead. “Model prediction continues to improve,” he says. “This is particularly evident in the improved accuracy of hurricane track and intensity forecasts, severe weather predictions of flooding and tornadoes, and regional climate prediction.”
Koch, for his part, says the future of weather prediction looks bright, indeed. But getting to the point where scientists are able to produce 60-minute warnings for extreme weather, he says, will be a major undertaking that will require enough computing power to run fine-scale ensemble models at an operational level. “That’s my dream,” he says. “It may be achievable within 15 years.”
Further Reading
Govett, M., Middlecoff, J., and Henderson, T.
Running the NIM next-generation weather model on GPUs, Proceedings of the IEEE/ACM International Conference on Cluster, Cloud, and Grid Computing, Melbourne, Victoria, Australia, May 17–20, 2010.

Henderson, T., Govett, M., Middlecoff, J., Madden, P., and Rosinski, J.
Experiences applying Fortran GPU compilers to numerical weather prediction models, Proceedings of the 2010 Symposium on Application Accelerators in High Performance Computing, Knoxville, TN, July 13–15, 2010.

Stensrud, D.J., et al.
Convective-scale warn-on-forecast: A vision for 2020, Bulletin of the American Meteorological Society 90, 10, Oct. 2009.

Stensrud, D.J. and Gao, J.
Importance of horizontally inhomogeneous environmental initial conditions to ensemble storm-scale radar data assimilation and very short range forecasts, Monthly Weather Review 138, 4, April 2010.

Yussouf, N. and Stensrud, D.J.
Impact of high temporal frequency phased array radar data to storm-scale ensemble data assimilation using observation system simulation experiments, Monthly Weather Review 138, 2, Feb. 2010.
Based in Los Angeles, Kirk L. Kroeker is a freelance editor and writer specializing in science and technology.
© 2011 ACM 0001-0782/11/11 $10.00