Cray Centers of Excellence Help Advance HPC for All

In 2003 Cray signed a contract with Oak Ridge National Laboratory (ORNL) for the installation of a Cray® X1™ supercomputer. This contract led to the creation of a program that continues to support Cray research and development worldwide. The agreement with ORNL included funding for a group of Cray experts to help the U.S. Department of Energy’s Office of Science researchers port and optimize their applications for the new system. This group was called the Cray Supercomputing Center of Excellence (COE). The mission of the Center of Excellence was multifaceted: assist the DOE’s Office of Science researchers in porting their applications from their existing IBM system; train the researchers and members of ORNL’s Scientific ...

Australian CIO Honored for Deployment of Cray System

Putting a 1.6 PF supercomputer into production on time and on budget, with no interruptions to one of the world’s top meteorological agencies: amazing. And it brought well-earned kudos to Dr. Lesley Seebeck, the chief information officer for Australia’s Bureau of Meteorology (BOM), who was named the 2017 Australian CIO of the Year for the federal government sector. iTnews, a publisher of IT-related news and research, presented the award to Dr. Seebeck at its annual awards event in February. According to iTnews, “Seebeck's team correctly predicted several years ago that the agency's high performance computing system would no longer be up to the task of processing BoM's complex climate modelling by 2016. The switch to a new Cray XC40 ...

Supercomputing in Oil and Gas: Cray CEO Looks Forward

At the 2017 Rice Oil & Gas HPC Conference, Cray CEO Peter Ungaro provided insight into current and future realities for HPC in oil and gas. His presentation, titled “Supercomputing: Yesterday, Today and Tomorrow,” is now available as a video. According to Ungaro, “What got us here today will not get us where we want to be tomorrow. As we start to think about things going forward, a lot of new possibilities open up to us which weren’t really available before. It’s going to require us to think a little differently.” The O&G industry, Ungaro said, is at a transition point in HPC deployment that is characterized by: A transition from a “best-price” ...

ExxonMobil Sets Record with Cray Supercomputer

ExxonMobil, working with the National Center for Supercomputing Applications (NCSA), has achieved a major breakthrough in complex oil and gas reservoir simulation: its proprietary software ran on more than four times the number of processors previously used for such models, improving exploration and production results. The parallel simulation breakthrough used 716,800 processors, the equivalent of harnessing the power of 22,400 computers with 32 processors each. ExxonMobil geoscientists and engineers can now make better investment decisions by predicting reservoir performance under geological uncertainty more efficiently, assessing a higher volume of alternative development plans in less time. The record run resulted in data output ...

Why PGS Made the Move from Clusters to Supercomputers

PGS is a marine seismic company that acquires high-resolution seismic data for use in imaging and modeling the earth’s subsurface. When the company launched its Triton survey in the Gulf of Mexico in November 2013, it knew it would end up with the largest seismic survey it had ever collected. When data acquisition finished in August 2014, PGS faced the most complex imaging challenge in its history. The high-fold, long-offset, dual-sensor, broadband survey provided full-azimuth data and covered approximately 10,000 square kilometers in a notoriously difficult-to-image area of the Gulf. The result: a 660-terabyte dataset.

Triton overcomes deepwater challenges

Considered the most revolutionary and technologically advanced ...