high level of accuracy requires systems that are too large for a single university. For example, the nodes processing Schnetter’s modeling code require communication with other nodes several times per second and a data-exchange rate of about one gigabyte per second. “To finish a simulation in a reasonable time, we need to use hundreds or thousands of cores,” he says. “That means we need to split the problem into many pieces, and we need to ensure that each of these pieces remains as independent from the others as possible.” By using this modular technique, Schnetter’s team can replace or exchange grid code if it becomes necessary to apply new physics or use a different hardware architecture.
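The decomposition Schnetter describes can be sketched in a few lines. The Python fragment below is purely illustrative (it is not Schnetter’s code, and the function names are invented): it splits a one-dimensional grid into chunks, pads each chunk with copies of its neighbors’ edge cells (“ghost” cells), and then updates every chunk without touching the others, which is the property that lets the pieces run on separate nodes.

```python
# Illustrative sketch of domain decomposition with ghost cells.
# Hypothetical names; not taken from any real simulation code.

def split_with_ghosts(grid, pieces):
    """Divide a 1-D grid into chunks, each padded with one neighbor
    cell on each side so it can be updated independently."""
    n = len(grid)
    size = n // pieces
    chunks = []
    for p in range(pieces):
        lo = p * size
        hi = (p + 1) * size if p < pieces - 1 else n
        left = grid[lo - 1] if lo > 0 else grid[lo]    # replicate at edges
        right = grid[hi] if hi < n else grid[hi - 1]
        chunks.append([left] + grid[lo:hi] + [right])
    return chunks

def smooth_chunk(chunk):
    """Three-point average of the interior; ghost cells are read-only."""
    return [(chunk[i - 1] + chunk[i] + chunk[i + 1]) / 3.0
            for i in range(1, len(chunk) - 1)]

grid = [float(i) for i in range(8)]
pieces = [smooth_chunk(c) for c in split_with_ghosts(grid, 2)]
stitched = pieces[0] + pieces[1]
```

Because each chunk carries the boundary data it needs, stitching the per-chunk results back together reproduces the serial computation exactly; in a production code the ghost cells would be refreshed over the network at every time step, which is where the gigabyte-per-second exchange rate comes in.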
Another project run on TeraGrid is an effort to understand the environmental impact of aviation. Conducted out of Stanford University by doctoral candidate Alexander Naiman and overseen by Sanjiva Lele, a professor in the department of aeronautics and astronautics, the project models condensation trails, the ice clouds formed by aircraft emissions. Naiman, whose research group specializes in computational fluid dynamics and turbulence simulations, says the difficulty of contrail modeling becomes increasingly acute as the complexity of the model increases. “The more complex the flow, the higher resolution required to simulate it, and the more resources are needed,” he says.
While Stanford has local supercomputing resources, they are in high demand. “TeraGrid provides relatively large and modern supercomputing resources to projects like ours that have no other supercomputing support,” he says. The simulation code that Naiman and his team run on TeraGrid was written at the Center for Turbulence Research at Stanford. Naiman says it was easy to get that code running on TeraGrid. The research group parallelized the program, a type of large eddy simulation, using standard message-passing interface strategies that Naiman says have been highly scalable on TeraGrid.
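As a hedged illustration of that message-passing pattern (not the Stanford group’s actual large eddy simulation code), the sketch below has two workers each own half of a one-dimensional field and trade one boundary cell per step before updating their interiors, the way neighboring MPI ranks would with send and receive calls. Threads and queues stand in here for MPI processes and the interconnect.

```python
# Hypothetical halo-exchange sketch in the spirit of MPI point-to-point
# communication; names and structure are invented for illustration.
import queue
import threading

def smooth(padded):
    """Three-point average of the interior of a padded chunk."""
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3.0
            for i in range(1, len(padded) - 1)]

def worker(field, send_q, recv_q, neighbor_on_right, steps, out, key):
    """Own one half of the field; trade a boundary cell every step."""
    for _ in range(steps):
        if neighbor_on_right:
            send_q.put(field[-1])      # like a send of our edge cell
            ghost = recv_q.get()       # like a receive of theirs
            padded = [field[0]] + field + [ghost]
        else:
            send_q.put(field[0])
            ghost = recv_q.get()
            padded = [ghost] + field + [field[-1]]
        field = smooth(padded)
    out[key] = field

to_right, to_left = queue.Queue(), queue.Queue()
out = {}
t1 = threading.Thread(target=worker, args=(
    [0.0, 1.0, 2.0, 3.0], to_right, to_left, True, 1, out, "left"))
t2 = threading.Thread(target=worker, args=(
    [4.0, 5.0, 6.0, 7.0], to_left, to_right, False, 1, out, "right"))
t1.start(); t2.start(); t1.join(); t2.join()
```

The communication per step is a single boundary value per neighbor, while each worker’s update touches its whole subdomain; that ratio of local work to exchanged data is what makes the strategy scale to hundreds or thousands of cores.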
The contrail modeling project is ongoing, but so far the Stanford team has simulated the first 20 minutes of contrail development for several scenarios, producing terabytes of three-dimensional flow fields and other data, such as time histories for cloud ice mass. Naiman says that although the TeraGrid data is still undergoing analysis, it is likely to help improve understanding of the development of contrails and their environmental impact. “TeraGrid performs as advertised, providing us with CPU hours that we would not have had access to otherwise,” he says. “We also take advantage of the large archival storage available on TeraGrid to ensure that important data is backed up.”
Improving Grid Software
As for the future of research on grid networks, TeraGrid’s Heinzel says he remains optimistic, but points out that improvements must be made in grid software not only to enhance ease of use for researchers such as Schnetter and Naiman, but also to take complete advantage of new generations of hardware. “You have to be almost a systems admin to set your parameters on data movement correctly so you can take full advantage of these systems,” says Heinzel. “So the software really needs to mature.”
Echoing these concerns, LSU’s Schnetter points out that his research groups consist of people with widely varying degrees of supercomputer experience. “Teaching everybody how to use the different systems, and staying on top of what works best on what system, and which parameters need to be tuned in what way to achieve the best performance, is like herding cats,” he says. “There are almost no GUIs for supercomputers, and most of the ones that exist are really bad, so that using them requires some arcane knowledge.”
Schnetter says he hopes that grid-based supercomputing will have a much larger influence on the curriculum than it does today, especially with so few universities teaching scientific programming at the level required to effectively use grid resources. “The good students in my group learned programming by themselves, on the side, because they were interested,” he says. Still, Schnetter suggests that such self-taught programming might not be sustainable in a world in which computers are becoming increasingly complex. “I hope that this changes in the next decade,” he says.
Stanford’s Naiman offers a similar observation. He says that while his work on the grid has been positive, the usability of the technology could be improved. For his part, TeraGrid’s Heinzel says he remains optimistic about the accessibility of grids, and predicts that major usability improvements are on the way. He likens the evolutionary pace of these grid developments to how the Web quickly emerged from the Internet and now requires little more than a browser and a basic knowledge of hyperlinks. “If you know exactly how to use grid tools, they work effectively,” says Heinzel. “Now we need to make them more user-friendly so we can get a wider audience.”
In the future envisioned by Heinzel, grids will be manipulated easily by computer scientists while still providing friendly interfaces for researchers coming from other fields. Rather than predicting that the arrival of such technologies will take decades or more, Heinzel says that much progress will be made in the next few years alone. “We’re going to see some big improvements in the usability of the grid and grid software in the next two to four years,” he says. “Future systems will be very user-friendly with a high degree of abstracting the inner workings of what’s going on from the end users.”
Further Reading

Ferreira, L., Lucchese, F., Yasuda, T., Lee, C.Y., Queiroz, C.A., Minetto, E., and Mungioli, A.S.R.
Grid Computing in Research and Education. IBM Redbooks, Armonk, NY, 2005.

Magoulès, F.
Fundamentals of Grid Computing: Theory, Algorithms, and Technologies. Chapman & Hall, Boca Raton, FL, 2009.

Neeman, H., Severini, H., Wu, D., and Kantardjieff, K.
Teaching high performance computing via videoconferencing, ACM Inroads 1, 1, March 2010.

Scavo, T., and Welch, V.
A grid authorization model for science gateways, International Workshop on Grid Computing Environments 2007, Reno, NV, Nov. 11, 2007.

Wong, J. (Ed.)
Grid Computing Research Progress. Nova Science Publishers, Hauppauge, NY, 2008.
Based in Los Angeles, Kirk L. Kroeker is a freelance editor and writer specializing in science and technology.
© 2011 ACM 0001-0782/11/0300 $10.00