Supercomputing Powers Clean Wind Energy

Dr. Lawrence Cheung and his team at GE Global Research are working to maximize the power production of wind farms. As a clean, renewable energy source, wind power is unbeatable. Research studies show that wind, harnessed effectively, could meet all the world’s energy demands. But a gap still exists between potential and reality ... a gap that Lawrence Cheung and GE Global Research are working to close. “We’re trying to understand what the wind is doing around the turbines, why it might not be getting enough power here or why it’s not efficient there,” says Dr. Cheung, a lead mechanical engineer in the Aerothermal discipline at the GE Global Research Center. There, he studies wind and noise — elements critical to the design of wind ...

How the Met Office Solved a Weather Forecasting Runtime Scare

When the Met Office chose Cray to supply three large XC40 supercomputers, Cray’s CEO, Peter Ungaro, made a bold statement. He said, “You will be installing the largest operational systems in the world. There will be problems at scale that you won’t have anticipated. Partnership with Cray will allow you to access our deep expertise and solve these problems.” Recent ambitious upgrades to the Met Office’s forecasting codes have given us the chance to test this claim. Operational weather forecasting is unlike many other areas of science because it is critical that the computer models run to a strict time schedule. A forecast that takes too long to run isn’t available in time for customers to make decisions, and so is worthless. The Met ...
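The runtime constraint described above can be made concrete with a small sketch: a forecast cycle either finishes inside its delivery window or it is of no use to customers. This is a minimal illustration only, assuming a hypothetical `run_forecast_cycle` wrapper and an illustrative deadline value; it does not reflect the Met Office's actual scheduling or suite-control software.

```python
import time

# Hypothetical delivery window for one forecast cycle (illustrative value only).
FORECAST_DEADLINE_SECONDS = 55 * 60


def run_forecast_cycle(run_model, deadline_s=FORECAST_DEADLINE_SECONDS):
    """Run one forecast cycle and report whether it met its delivery deadline.

    `run_model` is a stand-in callable for the actual model execution.
    """
    start = time.monotonic()
    result = run_model()
    elapsed = time.monotonic() - start
    on_time = elapsed <= deadline_s
    if not on_time:
        # A late forecast cannot inform customer decisions, so it is effectively worthless.
        print(f"Forecast missed its deadline by {elapsed - deadline_s:.0f} s")
    return result, on_time


# Toy usage: a "model" that sleeps briefly stands in for the real forecast run.
run_forecast_cycle(lambda: time.sleep(0.1), deadline_s=1.0)
```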

Exascale Can’t Wait: Cray for Oil & Gas in the Microsoft Azure Cloud

A path to exascale in upstream oil and gas processing with Cray in the Microsoft Azure cloud. At the 11th annual Rice Oil and Gas HPC Conference (March 12-13), a premier industry event, many presentations will address the (mostly US) exascale roadmap, with an emphasis on the algorithm development and applications enablement that make this program especially relevant to oil and gas. This author has long stated that “a perfect storm in seismic processing requirements is ensuring that the O&G industry will be an early adopter of exascale computing technologies.” That was in 2014, before the great oil crash. For a long while this author feared he was going to have to eat crow. After all, how much supercomputing power is needed to run a shale play? ...

Building a Computing Architecture for Drug Discovery

We recently had the pleasure of helping Jason Roszik and his colleagues at the University of Texas MD Anderson Cancer Center develop a high-throughput architecture supporting their work in identifying combination therapies for cancer. This work sits at the intersection of some major technology, processing and clinical trends, and it was quite an eye-opener — as well as a motivation — for us on how to use Cray-developed systems and processing technologies to build a useful and productive high-throughput IT architecture. The first trend, of course, is next-generation sequencing (NGS). Costs are dropping and sequencing throughput is rising dramatically, to the point where today’s NGS companies state they can process tens of human genomes a ...

Deep Learning at Scale with NERSC’s “Cori” Supercomputer

Cray is synonymous with large-scale computing. While this is by no means all we do, the research that leverages these large systems is interesting to examine as a leading indicator of things to come in the broader community. This has long been true in modeling and simulation, and we’re now beginning to see the same advances in the area of deep learning. For example, the Cray® XC™ “Cori” supercomputer at the National Energy Research Scientific Computing Center (NERSC) is an amazing system, sitting at #8 on the most recent Top500 list of the world’s fastest supercomputers. Of course, it’s the organization that surrounds Cori that enables the use of such technology. To this end, and to help further the use of Cori for ...
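The excerpt mentions deep learning at scale without describing a training approach; on large systems a common generic pattern is synchronous data-parallel training, where each worker computes gradients on its own shard of data and the gradients are averaged before every update. The sketch below simulates that pattern in plain NumPy as an illustration of the general technique only; it is not NERSC's or Cray's actual software stack, and the model, shard sizes and learning rate are all made up for the example.

```python
import numpy as np

def local_gradient(weights, x_shard, y_shard):
    """Mean-squared-error gradient of a linear model on one worker's data shard."""
    preds = x_shard @ weights
    return 2.0 * x_shard.T @ (preds - y_shard) / len(y_shard)

def data_parallel_step(weights, shards, lr=0.05):
    """One synchronous data-parallel update: every worker computes a local gradient,
    the gradients are averaged (the role an allreduce plays on a real system),
    and the same update is applied everywhere."""
    grads = [local_gradient(weights, x, y) for x, y in shards]
    return weights - lr * np.mean(grads, axis=0)

# Toy usage: four "workers", each holding a shard of a synthetic regression problem.
rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.5])
shards = []
for _ in range(4):
    x = rng.normal(size=(256, 3))
    y = x @ true_w + 0.01 * rng.normal(size=256)
    shards.append((x, y))

w = np.zeros(3)
for _ in range(200):
    w = data_parallel_step(w, shards)
print(w)  # converges toward true_w
```

On a real machine the plain mean over worker gradients would be replaced by an allreduce across nodes, which is what keeps every copy of the model identical after each step.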