Exascale Can’t Wait: Cray for Oil & Gas in the Microsoft Azure Cloud

A path to exascale in upstream oil and gas processing with Cray in the Microsoft Azure cloud

At the 11th annual Rice Oil and Gas HPC Conference (March 12-13), a premier industry event, many presentations will address the (mostly US) exascale roadmap, with an emphasis on the algorithm development and applications enablement that make the program especially relevant to oil and gas.

This author has long argued that “a perfect storm in seismic processing requirements is ensuring that the O&G industry will be an early adopter of exascale computing technologies.” That was in 2014, before the great oil crash. For a long while this author feared he was going to have to eat crow. After all, how much supercomputing power is needed to run a shale play? Remarkably, though, four years later and despite deep cost-cutting and the continuing, if not increasing, importance of the unconventionals, the industry has reacted in ways that make this message all the more relevant.

First of all, HPC in oil and gas is far from going away and continues to have bottom-line impact. Data acquisition at scale — applied to emerging as well as established imaging and inversion algorithms — will continue to fuel demand for more processing, especially in ultra-deep water. Industry analysts called the recent Brazil pre-salt rounds a “triumphant comeback,” and at least one major oil company, BP, has stoutly quantified the impact of advanced imaging in additional barrels of oil recovered (and the associated revenue) in a Gulf of Mexico field. 2017 was the year that reservoir simulators, long thought impervious to massively parallel and/or accelerated computing, became true HPC applications. Furthermore, even in the unconventionals, the importance of multiscale and multiresolution modeling of shales and their flow and morphological properties is being recognized. And in those same unconventional resources, companies like Total are applying recent machine learning techniques to well prediction and plan to use their massive computing power to assess even more complex applications on even more data.
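
To make that last point a bit more concrete, below is a minimal, purely illustrative sketch of the kind of supervised model that might be fit to per-well completion and geology features for production prediction. The feature names, units and synthetic data are assumptions made for this example only; they do not reflect Total’s (or any operator’s) actual data or workflow.

```python
# Illustrative sketch only: gradient-boosted regression for well-production
# prediction from completion/geology features. All feature names, units and
# data below are hypothetical stand-ins, not any operator's real inputs.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic per-well features an operator might collect.
lateral_length = rng.uniform(1500, 3500, n)   # lateral length, m
proppant_per_m = rng.uniform(500, 2500, n)    # proppant loading, kg/m
stage_spacing = rng.uniform(30, 90, n)        # frac stage spacing, m
porosity = rng.uniform(0.03, 0.12, n)         # reservoir porosity, fraction

X = np.column_stack([lateral_length, proppant_per_m, stage_spacing, porosity])
# Toy "true" response: 12-month cumulative production plus noise.
y = (0.02 * lateral_length + 0.05 * proppant_per_m
     - 200 * stage_spacing / 60 + 4000 * porosity
     + rng.normal(0, 50, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3, random_state=0)
model.fit(X_train, y_train)

print("Mean absolute error:", mean_absolute_error(y_test, model.predict(X_test)))
```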

For many oil and gas companies, exascale truly can’t wait!

But the investments needed to productively deploy exascale are substantial, and exascale is still a few years away. Furthermore, early deployment of exascale will almost surely be “bleeding edge.” So, what to do in the meantime to get the full benefits of the supercomputing (r)evolution?

Cray is putting serious effort into Cray in Azure. A dedicated, fully customized and managed Cray supercomputer, with the very high-bandwidth, high-capacity storage and parallel file system (PFS) solutions that are so important in the oil and gas segment, will be available in Microsoft Azure with integrated high-speed, low-latency connectivity to the Azure datacenter. In addition, a fully optimized, highly scalable compute software stack addresses the exponential demand for compute capability and the real-time insights needed by our oil and gas customers. Here, we aim to take to the cloud services that were previously impossible to move there (by virtue of their large repositories and/or requirements for dedicated, tightly coupled architectures), including high-resource, complex workloads such as supercomputing.

Equally important, with Cray in Azure we’re also preparing ourselves to expect the unexpected. We believe that our customers will present us with previously unimaginable scenarios in upstream processing that leverage the Azure cloud, its datacenter capabilities, its rich ecosystem and its ability to break down silos.

Intriguing, too, is the enablement of workloads that combine HPC, advanced analytics and perhaps even IoT at scale. In fact, the unconventionals are very data-rich and constantly monitored, and shale companies are looking to this one-cloud approach to help them tame their substantial IoT data streams, run advanced predictive analytics and manage data storage.
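
As a purely illustrative example of the analytics side of that combination, the sketch below flags anomalies in a synthetic wellhead-pressure stream using a simple rolling z-score. The sensor name, sampling cadence, threshold and data are all assumptions made for this example; real monitoring pipelines and models will differ.

```python
# Illustrative sketch only: rolling-window anomaly flagging on a synthetic
# wellhead-pressure stream, the kind of lightweight predictive-analytics step
# that might sit alongside HPC workloads in a single cloud environment.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("2018-03-01", periods=24 * 60, freq="min")  # one day, 1-minute cadence

pressure = 150 + np.cumsum(rng.normal(0, 0.05, len(idx)))  # slow drift around 150 bar
pressure[900:915] += 8.0                                   # injected transient "event"

df = pd.DataFrame({"wellhead_pressure_bar": pressure}, index=idx)

# Flag samples more than 4 rolling standard deviations from the rolling mean.
window = 60  # minutes
roll = df["wellhead_pressure_bar"].rolling(window)
zscore = (df["wellhead_pressure_bar"] - roll.mean()) / roll.std()
df["anomaly"] = zscore.abs() > 4

print(df[df["anomaly"]].head())
```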

Cray in Azure is a groundbreaking effort to extend the capabilities of the cloud to reach the top-end performance levels needed by the largest workloads. In doing so, a whole new path to exascale may open up: one where oil and gas companies can cost-effectively run their current high-end processing suites on the path to exascale and, ultimately, test and benchmark new HPC and AI hardware and software before investing capex and labor dollars.

If you’re attending the Rice Oil & Gas HPC Conference, please stop by and visit us at the event to learn more about Cray in Azure.


Geert C Wenes says

The power of compound interest (read: processing performance) is the most powerful force in the universe.
(with permission from A. Einstein!)

Doug says

Love it, Geert! We’ve come a long way from unrolling Fortran loops into assembly language to run on IBM mainframe vector features for seismic data processing, which is where my IT career started.
