Exascale Computing for Seismic Exploration

The exascale race — toward machines capable of executing 1 quintillion (10^18) operations per second — is well under way. The U.S., Japan, China and Europe have all established programs to enable exascale capabilities by around 2020.

In the U.S., leading the charge toward major HPC milestones usually falls to the Department of Energy (DOE): Both its Office of Science and the National Nuclear Security Administration have a significant interest in, and benefit from, ever-increasing levels of modeling, simulation and data-processing capability for a variety of applications. Both can bring critical resources (facilities and people), institutional expertise and deep know-how, if not intellectual property, to this pursuit.

Not all that far behind the DOE in pursuing exascale computing — and possibly one of the greatest beneficiaries of the effort — is a major commercial segment (and voracious consumer of data and processing capabilities): integrated oil and gas (O&G) companies (IOCs), especially those with a substantial presence in the Gulf of Mexico.

The business case for exascale in O&G is extremely compelling, and — as anyone who has read Daniel Yergin’s “The Prize” will appreciate — goes to the very core of why IOCs exist. In the search for oil and gas in the Gulf of Mexico, one of the richest hydrocarbon basins in the world and one that continues to reinvent itself with new exploration plays, the biggest prizes lie in ultra-deep water. In a deeply submerged area about 300 miles southwest of New Orleans and extending into Mexican waters, rock formations from the Paleogene period, also known as the Lower Tertiary, represent the leading edge of deep-water oil discovery. Down there (in fact, over 30,000 feet down there), the rocks are hot and the oil-bearing sands are high-pressure reservoirs buried under very thick salt and possibly sub-basalt layers.

Exploring for these reservoirs has become an incredibly technology-dependent business in which the acquisition of seismic data (reflected sound waves) plays a crucial role. Indeed, a modern survey can cover thousands of square kilometers of ocean surface, use up to three vessels producing sound waves along with five support vessels, each towing multiple streamers that stretch for kilometers and carry thousands of listening devices, and take months to complete. (Case in point: the Triton survey from PGS.) The petabytes of scientific big data acquired in this process must be processed so that geophysicists and geologists can deduce an accurate picture of the subsurface and determine whether it is worthwhile to drill an exploration well.
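
To get a sense of why such a survey yields petabytes of raw data, a back-of-the-envelope calculation helps. The Python sketch below is purely illustrative: the channel counts, sampling rate, record length and shot total are assumptions chosen to make the arithmetic concrete, not figures from the Triton or any other survey.

    # Rough, illustrative estimate of the raw data volume from a large marine
    # seismic survey. Every parameter below is an assumption chosen only to
    # make the arithmetic concrete; none comes from an actual survey.

    channels_per_streamer = 5_000   # hydrophone groups along one streamer
    streamers_per_vessel = 12       # streamers towed per recording vessel
    recording_vessels = 5           # vessels towing streamers
    sample_interval_ms = 2.0        # time sampling interval (milliseconds)
    record_length_s = 16.0          # listening time per shot (seconds)
    bytes_per_sample = 4            # 32-bit floating-point samples
    shots = 400_000                 # shots fired over the whole survey

    samples_per_trace = int(record_length_s * 1000 / sample_interval_ms)
    traces_per_shot = channels_per_streamer * streamers_per_vessel * recording_vessels
    total_bytes = shots * traces_per_shot * samples_per_trace * bytes_per_sample

    print(f"traces per shot  : {traces_per_shot:,}")
    print(f"samples per trace: {samples_per_trace:,}")
    print(f"raw volume       : {total_bytes / 1e15:.1f} PB")  # ~3.8 PB with these assumptions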

The stakes are high: Being wrong and drilling a dry hole is estimated to cost $100 million. The technical challenges are massive: Because the rocks are so deep, sound wave reflections are extremely weak. Because the formations are so complicated, it’s hard to figure out exactly where those very weak reflections are coming from. And yet the fidelity and resolution of fault lines and traps are critical. Here is where novel processing schemes meet powerful next-generation computers.

Seismic Depth Imaging Methods & HPC Evolution

In a pioneering review of the requirements of seismic processing at the 2011 Rice HPC Forum (a representative venue where fundamental research from academia meets applied research from the IOCs), Calandra, Etgen and Morton presented a chart showing where the industry is going and how it tracks the HPC roadmap. First, while the chart has its vertical axis in flops, it was immediately recognized that seismic processing is really (exa)scale computing, not just (exa)flops — that is, it’s a merger of big data and big compute. Second is the inflection point in processing demand and requirements that occurred in the early 2000s, when the O&G industry moved to medium model complexity in immediate response to the HPC industry breaking the petascale barrier, the latter enabled mostly by many-core co-processors and accelerators. This was validated later, around 2010, by the rapid adoption of new algorithms such as reverse time migration and full waveform inversion, in an industry that is otherwise rather conservative about replacing or complementing the tried-and-true with the novel. Every indication is that the industry continues to innovate on the algorithmic front and continues to track the HPC technology curve.
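
To see why these algorithms are so hungry for compute, consider the kernel at their core: time-stepped propagation of a wavefield through an earth model. The sketch below is a minimal 2D constant-density acoustic finite-difference propagator in Python/NumPy, with an assumed grid, velocity model and source wavelet. It is a toy illustration, not anything resembling a production code, which would use 3D grids, higher-order stencils, absorbing boundaries and anisotropic physics at vastly larger scale.

    import numpy as np

    # Minimal 2D constant-density acoustic propagator, 2nd order in time and
    # space. A toy stand-in for the wavefield kernel inside RTM / FWI; all
    # model and acquisition parameters here are illustrative assumptions.

    nx, nz = 400, 400              # grid points in x and z
    dx = 10.0                      # grid spacing (m)
    dt = 0.001                     # time step (s); v_max*dt/dx ~ 0.45 keeps it stable
    nt = 2000                      # number of time steps

    v = np.full((nz, nx), 2500.0)  # background velocity (m/s)
    v[250:, :] = 4500.0            # a fast layer at depth (think salt)

    # Ricker wavelet source, 25 Hz peak frequency, injected near the surface.
    f0 = 25.0
    t = np.arange(nt) * dt
    arg = (np.pi * f0 * (t - 1.0 / f0)) ** 2
    src = (1.0 - 2.0 * arg) * np.exp(-arg)
    sz, sx = 5, nx // 2

    p_prev = np.zeros((nz, nx))    # wavefield at t - dt
    p_curr = np.zeros((nz, nx))    # wavefield at t

    for it in range(nt):
        # 5-point Laplacian on the interior of the grid.
        lap = (p_curr[1:-1, 2:] + p_curr[1:-1, :-2] +
               p_curr[2:, 1:-1] + p_curr[:-2, 1:-1] -
               4.0 * p_curr[1:-1, 1:-1]) / dx ** 2

        # Leapfrog update: p(t+dt) = 2 p(t) - p(t-dt) + (v dt)^2 * Laplacian(p).
        p_next = np.zeros_like(p_curr)
        p_next[1:-1, 1:-1] = (2.0 * p_curr[1:-1, 1:-1] - p_prev[1:-1, 1:-1]
                              + (v[1:-1, 1:-1] * dt) ** 2 * lap)
        p_next[sz, sx] += src[it] * dt ** 2   # point-source injection

        p_prev, p_curr = p_curr, p_next

    print("peak wavefield amplitude:", np.abs(p_curr).max())

Reverse time migration runs this kind of propagation twice per shot, forward with the source and backward with the recorded data, and correlates the two wavefields; full waveform inversion repeats the whole exercise many times inside an optimization loop. Multiply that by tens of thousands of shots and ever finer grids, and the pull toward exascale is easy to see.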

Given these three trends — a major push and funded initiative toward enabling technologies for exascale, a compelling business case and a recent inflection point in algorithmic capabilities — I’d say we have a perfect storm in seismic processing requirements. I expect the IOCs may surprise us with their rapid adoption and deployment of exascale computing as it becomes available.
