Computing Models Advancing as Oil and Gas Industry Demands Change


The oil and gas industry has long been among the leading practitioners of supercomputing technology, with 3D seismic data processing a staple of the sector for years. A more recent development adds further layers of complexity and opportunity, and promises to improve exploration and production operations: Permanent Reservoir Monitoring, or PRM.

In the past, the development plan for a newly discovered field was based entirely on the original seismic data and the observations made by the discovery well and subsequent wells. Over the past two decades it has been demonstrated at more than twenty producing fields around the world that when 3D seismic surveys are repeated two or three times a year, it is possible to detect and follow changes in reservoir reflectivity caused by the movement of injection and production fluids. PRM is the integration of this time-lapse, or 4D, seismic data with the ongoing stream of production data and new well information, continuously improving the accuracy of the reservoir model. The result is that production companies face a major unstructured big data challenge that requires innovative high-performance computing models to get the job done.
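
To make the core 4D idea concrete, here is a minimal sketch, not drawn from the article, of differencing a baseline and a monitor survey to highlight reflectivity changes caused by fluid movement. The arrays, grid sizes, and reservoir window are synthetic stand-ins for real migrated seismic volumes.

```python
# Hedged sketch: difference a baseline and a monitor survey to expose
# time-lapse (4D) amplitude changes. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Two co-located 3D volumes: (inlines, crosslines, time samples).
baseline = rng.normal(size=(200, 200, 500))
monitor = baseline.copy()

# Imitate a production-induced amplitude change in a small reservoir zone.
monitor[80:120, 90:130, 300:340] *= 1.15

# The 4D difference volume; ideally near zero outside the reservoir,
# assuming the surveys are well repeated and cross-equalized.
difference = monitor - baseline

# Collapse the reservoir interval to an RMS amplitude-change map that an
# interpreter could compare against injection and production history.
rms_change = np.sqrt(np.mean(difference[:, :, 300:340] ** 2, axis=2))
print("Peak RMS amplitude change:", rms_change.max())
```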

Looking at the dynamics of PRM

The time interval between the acquisition of the original seismic data, evaluating which lease to bid, the decision where to drill, and finally the discovery and first production, may be a period of five and even ten years for deep-water wells. During that time, new data acquisition methods and data processing algorithms will have been developed that enable the production company to reprocess the original data and achieve even better resolution and greater accuracy in characterizing the target reservoir. Subsequent PRM provides the opportunity to continuously update and improve the reservoir model by correlating measured seismic change with fluid flow, to infer permeability, detect barriers to fluid flow, and to validate geological interpretations, update the petrophysical parameters derived from well logs, cores, and the production history.
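
One hedged sketch of what that feedback loop can look like in practice: compare the observed 4D change against the change predicted by the current reservoir simulation model, and flag cells where they disagree as candidates for review (for example, an unmodelled flow barrier). The grids, noise levels, and threshold below are illustrative assumptions, not a published workflow.

```python
# Hedged sketch of one PRM model-update step: residuals between observed
# and simulated 4D change point to places where the reservoir model may
# need revision. Inputs are illustrative numpy grids.
import numpy as np

rng = np.random.default_rng(1)

observed_change = rng.normal(size=(100, 100))          # from 4D seismic
predicted_change = observed_change + rng.normal(scale=0.2, size=(100, 100))

# Pretend a sealing fault the simulator does not know about blocks flow
# in one corner of the grid, so the prediction is wrong there.
predicted_change[70:, 70:] = 0.0

residual = observed_change - predicted_change
threshold = 3 * np.std(residual[:50, :50])             # calibrate on a quiet area
mismatch = np.abs(residual) > threshold

print(f"{mismatch.mean():.1%} of cells flagged for geologic/petrophysical review")
```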

In one PRM project, 14 seismic surveys were collected over a ten-year period. The insights gained from the continuous refinement of the reservoir model not only improved production during that time, but the original estimate of reserves in place doubled twice, extending the life of the field by decades. This kind of success requires dedicated interdisciplinary sharing and integration of many different kinds of data and interpretations. It is clear that as experience with PRM technology grows, the industry faces an enormous data warehousing, management, and analysis challenge that depends heavily on advanced computing methods.

Understanding the computing challenges of PRM

The unstructured big data analysis required by oil and gas companies is particularly difficult because it requires organizations to compile, integrate, and use data of many different types. These different forms of data must integrate with one another so they can reach end users in the field in a concise and usable manner, which means placing clear constraints and parameters on the data sets so they can be combined effectively.
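
As a simplified illustration of those "clear constraints and parameters," the sketch below gives each data type a small, explicit schema keyed on the same field identifier and timestamp so seismic, well, and production records can be pulled together. The class names, fields, and sample values are hypothetical, not an industry standard.

```python
# Hedged sketch: a shared key (field_id, date) lets heterogeneous PRM data
# types be integrated. Schemas and values here are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class SeismicSurvey:
    field_id: str
    acquired: date
    volume_uri: str          # pointer to the processed seismic volume

@dataclass
class WellLog:
    field_id: str
    well_id: str
    logged: date
    porosity: float          # averaged over the reservoir interval

@dataclass
class ProductionRecord:
    field_id: str
    well_id: str
    month: date
    oil_bbl: float
    water_bbl: float

def records_for_field(field_id, *collections):
    """Pull every record for one field, whatever its type."""
    return [r for coll in collections for r in coll if r.field_id == field_id]

surveys = [SeismicSurvey("field-a", date(2010, 6, 1), "s3://seis/field_a_2010")]
logs = [WellLog("field-a", "A-1", date(2011, 2, 14), porosity=0.31)]
production = [ProductionRecord("field-a", "A-1", date(2011, 3, 1), 42_000, 5_500)]

print(records_for_field("field-a", surveys, logs, production))
```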

On top of all this, the information needed on an ongoing basis must be available alongside historic data so that field workers can compare analysis results and use that knowledge to make effective decisions. This means organizations need to be able to access PRM data gathered over the course of decades at any time, a major storage and analysis challenge that is pushing the boundaries of data warehousing.
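
A minimal sketch of that kind of access, assuming a time-indexed archive of survey metadata: retrieve both a decades-old baseline and the latest monitor survey so the two analyses can be compared. The in-memory list and URIs below are placeholders; in practice this index would live in a data warehouse or Hadoop-style store.

```python
# Hedged sketch: time-indexed retrieval of the oldest and newest surveys
# for a field, for 4D comparison. Archive contents are illustrative.
from datetime import date

archive = [
    {"field_id": "field-a", "acquired": date(1998, 5, 1), "uri": "tape://vault/field_a_1998"},
    {"field_id": "field-a", "acquired": date(2006, 9, 1), "uri": "s3://seis/field_a_2006"},
    {"field_id": "field-a", "acquired": date(2014, 4, 1), "uri": "s3://seis/field_a_2014"},
]

def baseline_and_latest(field_id, archive):
    """Return the oldest and newest surveys recorded for a field."""
    surveys = sorted((s for s in archive if s["field_id"] == field_id),
                     key=lambda s: s["acquired"])
    return surveys[0], surveys[-1]

old, new = baseline_and_latest("field-a", archive)
print("Compare", old["uri"], "against", new["uri"])
```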

Furthermore, the data sets used in PRM are needed for such an extended period of time that organizations must migrate them across successive generations of technology to ensure they remain viable and available to end users. This creates another layer of integration and analytics challenges that must be addressed to support the increasingly prominent trend toward PRM.

What Cray is doing to address oil and gas industry challenges

As a customer-focused supercomputing company, Cray is hard at work developing the HPC systems and storage architectures that oil and gas companies need to support not only challenging seismic processing and reservoir simulation workloads, but also the evolving area of PRM. Join us at the upcoming Society of Exploration Geophysicists International Exposition in Houston, where I will discuss Cray's Hadoop and Urika appliance technologies and how these solutions can be repurposed to perform the data warehousing, processing, and querying suitable for PRM use cases. You can also learn more about Cray's efforts in the oil and gas industry here.

Wulf Massell, Chief Geophysicist and Energy Segment Manager

