5.2. Accuracy
To quantify the accuracy of force computation on Anton, we measured the relative rms force error, defined as the rms error in the force on all particles divided by the rms force.¹⁸ For the DHFR system with typical simulation parameters, Anton achieves a relative rms force error of 1.5 × 10⁻⁴. A relative rms force error below 10⁻³ is generally considered sufficiently accurate for biomolecular MD simulations.²⁵
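As a concrete illustration, this metric can be computed offline by comparing reduced-precision forces against a high-precision reference. The following Python sketch shows the arithmetic; the function and variable names are illustrative, not part of Anton's actual tooling:

```python
import numpy as np

def relative_rms_force_error(f_test, f_ref):
    """Relative rms force error for force arrays of shape (n_atoms, 3).

    The rms error in the force on all particles, divided by the rms
    of the reference forces, as defined above.
    """
    rms_error = np.sqrt(np.mean((f_test - f_ref) ** 2))
    rms_force = np.sqrt(np.mean(f_ref ** 2))
    return rms_error / rms_force

# Hypothetical usage: compare hardware forces against a
# double-precision reference; a value near 1.5e-4 would match the
# DHFR result reported above.
# err = relative_rms_force_error(forces_anton, forces_reference)
```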
We also measured energy drift to quantify the overall accuracy of our simulations. An exact MD simulation would conserve energy exactly; errors in the simulation generally lead to an increase in the total energy of the simulated system over time, a phenomenon known as energy drift. We measured energy drift over 5 ns of simulated time (2 million time steps) for DHFR using a bit-accurate numerical emulator that exactly duplicates Anton's arithmetic. While the simulation exhibited short-term energy fluctuations of a few kcal/mol (about 0.001% of the total system energy), there was no detectable long-term trend in total energy. MD studies are generally considered more than adequate even with significantly higher energy drift.²⁴
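One simple way to separate short-term fluctuations from long-term drift is to fit a line to the total energy as a function of simulated time; a slope that is negligible relative to the residual fluctuations indicates no detectable drift. A minimal sketch, assuming total energies sampled along the trajectory (names hypothetical):

```python
import numpy as np

def energy_drift(times_ns, energies_kcal):
    """Least-squares estimate of long-term energy drift.

    Returns the fitted slope (kcal/mol per ns) and the standard
    deviation of the residuals, which measures the short-term
    fluctuations around the trend.
    """
    slope, intercept = np.polyfit(times_ns, energies_kcal, 1)
    residuals = energies_kcal - (slope * times_ns + intercept)
    return slope, residuals.std()

# A fitted slope much smaller than residuals.std() / total_time
# suggests the trend is not distinguishable from the fluctuations.
```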
5.3. Scaling with chemical system size
Figure 6 shows the scaling of performance with chemical
system size. Within the range where chemical systems fit in
on-chip memory, we expect performance to scale roughly
linearly with the number of atoms, albeit with occasional
jumps as different operating parameters change to optimize performance while maintaining accuracy. The largest
discontinuity in simulation rate occurs at a system volume of approximately 500,000 Å³ when we change from a 32 × 32 × 32 FFT grid to a 64 × 64 × 64 FFT grid, reflecting the fact that our code supports only power-of-two-length FFTs. This lengthens the long-range calculation because the number of grid points increases by a factor of 8.

Figure 6. Scaling of performance for a 512-node version of Anton with increasing chemical system size. The graph shows a stacked bar chart for each chemical system, with the height of each stack proportional to the simulation time, assuming that long-range forces are evaluated every other time step. Each stack represents the time required to execute two consecutive time steps: one is a "long-range time step" that includes calculation of long-range electrostatics by k-GSE, and the other is a "range-limited time step" that does not. The chemical systems represent proteins and nucleic acids of various sizes, surrounded by water.

Overall, the results
are consistent with supercomputer scaleup studies—as we
increase chemical system size, Anton’s efficiency improves
because of better overlap of communication and computation, and because calculation pipelines operate closer to
peak efficiency.
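The grid-size jump described above can be reproduced with a back-of-the-envelope calculation: if each grid dimension must be the smallest power of two that keeps the mesh spacing below some maximum, a cubic box growing past roughly 80 Å per side (about 500,000 Å³) forces the dimension from 32 to 64. A sketch, using a hypothetical 2.5 Å maximum spacing (the actual threshold depends on accuracy parameters not given here):

```python
def fft_grid_dim(box_edge_angstrom, max_spacing_angstrom=2.5):
    """Smallest power-of-two grid dimension that keeps the mesh
    spacing at or below the maximum (2.5 A is an assumed value)."""
    n = 1
    while box_edge_angstrom / n > max_spacing_angstrom:
        n *= 2
    return n

# Crossing ~80 A per side (~500,000 A^3) jumps the grid from 32^3
# to 64^3, i.e., 8x more grid points, which lengthens the
# long-range time step.
for edge in (78.0, 82.0):
    print(edge, fft_grid_dim(edge))  # -> 32, then 64
```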
6. Conclusion
We are currently in the process of building a specialized,
massively parallel machine, called Anton, for the high-speed
execution of MD simulations. We expect Anton to be capable of simulating the dynamic, atomic-level behavior of proteins and other biological macromolecules in an explicitly
represented solvent environment for periods on the order
of a millisecond—about three orders of magnitude beyond
the reach of current MD simulations. The machine uses specialized ASICs, each of which performs a very large number
of application-specific calculations during each clock cycle.
Novel architectural and algorithmic techniques are used to
minimize intra- and inter-chip communication, providing
an unusually high degree of scalability.
While it contains programmable elements that could in
principle support the parallel execution of algorithms for a
wide range of other applications, Anton was not designed to
function as a general-purpose scientific supercomputer, and
would not in practice be well suited for such a role. Rather,
we envision Anton serving as a computational microscope,
allowing researchers to observe for the first time a wide range
of biologically important structures and processes that have
thus far proven inaccessible to both computational modeling and laboratory experiments.
References

1. Adcock, S.A. and McCammon, J.A. Molecular dynamics: Survey of methods for simulating the activity of proteins. Chemical Reviews, 106:1589–1615, 2006.

2. Bhatele, A., Kumar, S., Mei, C., Phillips, J.C., Zheng, G., and Kale, L.V. Overcoming scaling challenges in biomolecular simulations across multiple platforms. To appear in Proceedings of the IEEE International Parallel and Distributed Processing Symposium (IPDPS 2008), Miami, FL, 2008.

3. Bowers, K.J., Chow, E., Xu, H., Dror, R.O., Eastwood, M.P., Gregersen, B.A., Klepeis, J.L., Kolossvary, I., Moraes, M.A., Sacerdoti, F.D., Salmon, J.K., Shan, Y., and Shaw, D.E. Scalable algorithms for molecular dynamics simulations on commodity clusters. Proceedings of the ACM/IEEE Conference on Supercomputing (SC06), Tampa, FL, 2006.

4. Bowers, K.J., Dror, R.O., and Shaw, D.E. Zonal methods for the parallel execution of range-limited N-body problems. Journal of Computational Physics, 221(1):303–329, 2007.

5. Fitch, B.G., Rayshubskiy, A., Eleftheriou, M., Ward, T.J.C., Giampapa, M.E., Pitman, M.C., Pitera, J.W., Swope, W.C., and Germain, R.S. Blue Matter: Scaling of N-body simulations to one atom per node. IBM Journal of Research and Development, 52(1/2), 2008.

6. Fine, R.D., Dimmler, G., and Levinthal, C. FASTRUN: A special purpose, hardwired computer for molecular simulation. Proteins: Structure, Function, and Genetics, 11(4):242–253, 1991 (erratum: 14(3):421–422, 1992).

7. Germain, R.S., Fitch, B., Rayshubskiy, A., Eleftheriou, M., Pitman, M.C., Suits, F., Giampapa, M., and Ward, T.J.C. Blue Matter on Blue Gene/L: Massively parallel computation for biomolecular simulation. Proceedings of the Third IEEE/ACM/IFIP International Conference on Hardware/Software Codesign and System Synthesis (CODES+ISSS '05), New York, NY, 2005.

8. Hess, B., Kutzner, C., van der Spoel, D., and Lindahl, E. GROMACS 4: Algorithms for highly efficient, load-balanced, and scalable molecular simulation. Journal of Chemical Theory and Computation, 4(2):435–447, 2008.
9. Jorgensen, W.L., Maxwell, D.S., and Tirado-Rives, J. Development and testing of the OPLS all-atom force field on conformational energetics and properties of organic liquids. Journal of the American Chemical Society, 118(45):11225–11236, 1996.