NERSC’s “Edison” Unleashed


Now that our new flagship supercomputer Edison, a 2.5-petaflop Cray® XC30™, has been in full production mode for a couple of months, it seems like a good time to check in and see how scientists are using it.

At the top of the list of hours used are teams of scientists studying the fundamentals of the standard model of particle physics, the structure of the Earth’s subsurface, clustering of matter in the early universe, fusion energy, clean combustion, how salts bind to water, nano-characteristics of catalysts, table-top accelerators, carbon sequestration, and extreme climate events.

If that seems like a diverse and intriguing array of topics, that’s because it is.  NERSC, the National Energy Research Scientific Computing Center, is the production supercomputing and data storage facility for the U.S. Department of Energy’s Office of Science and provides services and resources for about 5,000 research scientists from around the U.S. and beyond. Most of the research is funded by the Office of Science, which is the largest U.S. supporter of basic research in the physical sciences.

Edison derives its computational power from 133,824 Intel® “Ivy Bridge” Xeon® compute cores running at 2.4 gigahertz (GHz), the Aries network interconnect, and 64 gigabytes (GB) of memory per 24-core node. With these characteristics, Edison nicely augments our Cray® XE6™ supercomputer “Hopper,” which has 2.1 GHz AMD “Magny-Cours” processors. To most NERSC users, Edison appears as a more powerful, larger-memory version of the familiar Hopper. Because of that, many users were able to quickly rebuild their codes on Edison and start running right away.
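Those headline figures hang together under a quick back-of-the-envelope check. The short sketch below recomputes the node count, aggregate memory, and theoretical peak from the per-core numbers above; the 8 flops-per-cycle-per-core figure for Ivy Bridge is our assumption, not something stated in the article.

```python
# Rough sanity check of Edison's headline specs (illustrative, not official numbers).
# Assumption: each Ivy Bridge core retires 8 double-precision flops per cycle (AVX).
cores = 133824            # Intel "Ivy Bridge" Xeon compute cores
clock_hz = 2.4e9          # 2.4 GHz per core
cores_per_node = 24
mem_per_node_gb = 64
flops_per_cycle = 8       # assumed: 4 DP adds + 4 DP multiplies per cycle

nodes = cores // cores_per_node
aggregate_mem_tb = nodes * mem_per_node_gb / 1000   # decimal terabytes
peak_pflops = cores * clock_hz * flops_per_cycle / 1e15

print(f"{nodes} nodes, ~{aggregate_mem_tb:.0f} TB memory, "
      f"~{peak_pflops:.2f} PFlop/s theoretical peak")
# -> 5576 nodes, ~357 TB memory, ~2.57 PFlop/s theoretical peak
```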

“Our code runs about twice as fast on Edison as on Hopper,” says University of Utah physicist Carleton DeTar. University of Arizona physicist Doug Toussaint agrees: “The comparisons I did generally showed a better than factor of two increase in speed in going from Hopper to Edison.”

DeTar and Toussaint are running Lattice Quantum Chromodynamics (LQCD) simulations that make precise predictions based on the Standard Model of particle physics. These predictions can then be compared with experimental results from particle colliders such as the Large Hadron Collider, providing a stringent test of how well theory matches nature. The Standard Model has been spectacularly successful over the years at describing the fundamental nature of our universe and the particle interactions that take place within it. But there are growing hints from cosmology and particle experiments that there might be physics beyond the Standard Model, a possible scientific revolution in the making. Simulations like those being performed by the groups led by Paul Mackenzie of Fermi National Accelerator Laboratory and by Toussaint play an important role in this research. Combined, the LQCD community has used more than 21 million core-hours on Edison already this year.

LQCD is not the largest consumer of Edison cycles, however. Fusion energy and materials science each have used a few million more, and chemistry and astrophysics are not far behind.

Most of these teams are taking advantage of Edison's size and the Aries network, as reflected in the number of compute cores their parallel jobs use concurrently. Our users are free to run jobs of any size, and they tend to find a “sweet spot” that maximizes their scientific productivity given their yearly allocation of time from DOE. As shown below, the top research teams are using from tens of thousands to more than 100,000 cores, a testament to the scaling capabilities of the Aries network.
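To make that “sweet spot” concrete, here is a minimal sketch of the core-hour arithmetic behind it. The allocation size, run length, and job sizes are hypothetical, chosen only to illustrate how charging roughly at cores times wall-clock hours caps the number of large runs a team can afford in a year.

```python
# Hypothetical illustration of budgeting jobs against a yearly core-hour allocation.
# Jobs are charged roughly as (cores used) x (wall-clock hours).
allocation_core_hours = 50_000_000   # hypothetical yearly DOE allocation

def affordable_runs(cores, hours_per_run):
    """Number of runs of a given size that fit within the allocation."""
    return allocation_core_hours // (cores * hours_per_run)

for cores in (6_144, 24_576, 98_304):   # small, medium, and large jobs
    print(f"{cores:6d} cores x 12 h: {affordable_runs(cores, 12)} runs per year")
```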

Here is a summary of the top 10 projects and their principal investigators on Edison from January 14 to March 7, 2014.

LATTICE QCD
Top projects: Charm-quark and bottom-quark mass tuning for lattice QCD heavy-light physics, Paul Mackenzie, Fermilab; Quantum Chromodynamics with four flavors of dynamical quarks, Doug Toussaint, University of Arizona
Jobs: largest 49,152 cores; typical 6,144 or 24,576 cores
Code: MILC

GEOPHYSICS
Top project: Large Scale 3-D Geophysical Inversion and Imaging, Greg Newman, Berkeley Lab
Jobs: largest 25,992 cores; typical 17,064 cores
Code: EMGeo

ASTROPHYSICS
Top projects: Simulating Cosmological Lyman-alpha Forest, Zarija Lukic, Berkeley Lab; Synthetic Spectra of Astrophysical Objects, Edward Baron, University of Oklahoma
Jobs: largest 76,800 cores; typical 49,152 or 76,800 cores
Codes: Nyx, PHOENIX

FUSION ENERGY RESEARCH
Top project: Center for Edge Physics Simulation, Choong-Seock Chang, Princeton Plasma Physics Laboratory
Jobs: largest 98,304 cores; typical 19,200 cores
Code: XGC

COMBUSTION RESEARCH
Top projects: Direct Numerical Simulations of Clean and Efficient Combustion with Alternative Fuels, Jacqueline Chen, Sandia National Laboratories; Simulation and Analysis of Reacting Flows, John Bell, Berkeley Lab
Jobs: largest 100,248 cores; typical 25,608 and 10,512 cores
Codes: S3D, LMC3D

CHEMISTRY
Top project: Advanced Modeling of Ions in Solutions, on Surfaces, and in Biological Environments, Michael L. Klein, Temple University
Jobs: largest 1,056 cores; typical 528 cores
Code: Quantum ESPRESSO

MATERIALS SCIENCES
Top project: DFT analysis of energetics for nanostructure formation and catalysis on metal surfaces, Da-Jiang Liu, Ames Laboratory/Iowa State University
Jobs: largest 37,200 cores; typical 37,200 cores
Code: VASP


So far in 2014, Edison has proven to be a dependable, efficient, highly usable workhorse. It’s been used by more than 600 scientists working on more than 300 different projects. Given past history (NERSC claims association with a number of Nobel prizes and numerous “breakthroughs of the year”), Edison may right now be contributing to the next big scientific accomplishment. For a real-time list of top projects computing at NERSC, visit our home page at http://www.nersc.gov/.

Richard Gerber, NERSC Senior Science Advisor and User Services Group Lead
