What Next for Advanced Analytics?


It would be difficult to argue with the success that upstream exploration companies have enjoyed in the past century. But is that an argument against change? After all, given that success, why should companies continue to innovate and adapt? Because the business models of years past are rapidly shifting under the influx of new technology and data. In a way, the past has set the stage for the next act, as a prologue does in a play.

We have moved steadily from the empirical models that framed our thinking centuries ago to theoretical treatments of data, designing models built on qualified generalizations. Such luminaries as Sir Isaac Newton, Johannes Kepler and James Clerk Maxwell made enormous contributions to our understanding of Mother Nature that, by extension, enabled the geoscientific community to grasp the fundamentals of physics and mathematics needed to describe the heterogeneous complexity inherent in hydrocarbon reservoirs.

Only a few decades have passed since we entered the computational branch of science, simulating complex systems, and edged toward the current landscape shaped by data-intensive exploratory analysis and data-driven models. Let the data tell the story and let history set the context for the present as we combat data overload, marrying first principles with stochastic modeling methodologies to ask more questions and harvest more insight.

Moving forward, advanced analytics will use more data in near-real time, enabling rapid decision-making at the point of operations. Oil and gas business models continue to change to capitalize on this next evolution of analytics. Three areas of the upstream business are ripe for the transformation: subsurface dynamics, drilling automation, and reservoir management.

  • Subsurface Mechanical and Fluid Dynamics

In subsurface fluid mechanics, there exists a dichotomy in describing the forces that drive subsurface fluids. On one hand, there is the application of velocity potentials [energy/unit volume] within engineering hydraulics, including reservoir engineering. On the other hand, A. Scheidegger stated unequivocally: “It is a force potential and not a velocity potential which governs flow through porous media.” As a result, the industry has adopted a number of simplifications in geometry and assumptions regarding the properties of oil, gas and water, as well as simplifications of Darcy’s equation, to find reasonable answers to practical problems using analytical equations. But time and again, seasoned professionals are challenged by the complexities inherent in reservoir modeling. Data-driven methodologies are gaining momentum as a complement to first-principles equations, largely owing to the accelerated increase in data sources and data volumes that surface hidden knowledge when mined by advanced analytical workflows.
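To make the analytical simplification concrete, consider the single-phase, linear, incompressible form of Darcy’s equation, q = kAΔP/(μL). The short Python sketch below shows how such an expression reduces a complex flow problem to a handful of parameters; the function name and the input values are purely illustrative.

```python
def darcy_flow_rate(k_md, area_m2, mu_cp, dp_pa, length_m):
    """Volumetric flow rate from linear, single-phase Darcy flow:
    q = (k * A / mu) * (dP / L).

    Units are converted to SI internally:
    k_md  -- permeability in millidarcies (1 mD ~= 9.869e-16 m^2)
    mu_cp -- viscosity in centipoise (1 cP = 1e-3 Pa.s)
    Returns q in m^3/s, positive down the pressure gradient.
    """
    k_m2 = k_md * 9.869233e-16
    mu_pas = mu_cp * 1.0e-3
    return (k_m2 * area_m2 / mu_pas) * (dp_pa / length_m)

# Illustrative case: 100 mD sand, 10 m^2 cross-section, 1 cP brine,
# 1 MPa pressure drop over 100 m.
q = darcy_flow_rate(k_md=100.0, area_m2=10.0, mu_cp=1.0,
                    dp_pa=1.0e6, length_m=100.0)
print(f"q = {q:.3e} m^3/s")
```

Every real reservoir violates at least one of these assumptions, which is precisely where data-driven complements earn their keep.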

Reservoir simulation and modeling is essentially a bottom-up methodology, initiated by the development of a geo-cellular reservoir model. Deterministic modeling and geo-statistical data manipulation enhance the geo-cellular model by integrating the best available and most current petrophysical and geophysical knowledge. Engineering first principles of fluid flow are then added in an attempt to resolve a dynamic reservoir model numerically.
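As a rough illustration of that bottom-up starting point, the sketch below populates a geo-cellular porosity grid from sparse well control using inverse-distance weighting, a deliberately simple deterministic stand-in for the kriging and geostatistical simulation a real workflow would apply. All well locations and porosity values are hypothetical.

```python
import numpy as np

def idw_porosity_grid(wells_xyz, phi_obs, grid_shape, power=2.0):
    """Populate a geo-cellular grid with porosity by inverse-distance
    weighting from sparse well control points.

    wells_xyz -- (n, 3) array of well sample locations (cell coords)
    phi_obs   -- (n,) porosity observed at those locations
    A simple stand-in for the geostatistical step of a real workflow.
    """
    nx, ny, nz = grid_shape
    cells = np.stack(np.meshgrid(np.arange(nx), np.arange(ny),
                                 np.arange(nz), indexing="ij"), axis=-1)
    cells = cells.reshape(-1, 3).astype(float)
    d = np.linalg.norm(cells[:, None, :] - wells_xyz[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    phi = (w @ phi_obs) / w.sum(axis=1)
    return phi.reshape(grid_shape)

# Three hypothetical wells sampling a 20 x 20 x 5 cell model
wells = np.array([[3, 4, 2], [15, 6, 1], [10, 17, 3]], dtype=float)
phi = idw_porosity_grid(wells, np.array([0.22, 0.14, 0.18]), (20, 20, 5))
print(phi.mean(), phi.min(), phi.max())
```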

Shahab Mohaghegh et al. have proposed a Top-Down Intelligent Reservoir Modeling (TDIRM) methodology that starts by gathering intelligence from field measurements such as production data, enhanced and calibrated by core, well log and seismic data. This approach is not intended as a substitute for conventional reservoir modeling workflows but as a rich, complementary data-driven perspective. TDIRM embraces artificial intelligence and data mining workflows such as neural networks and fuzzy pattern recognition, and its ease of use enables short development cycles and minimal data sets to illuminate and supplement traditional reservoir management techniques.
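Purely to illustrate the data-driven spirit of such an approach (this is not Mohaghegh’s actual TDIRM implementation), the sketch below trains a small neural network on synthetic per-well measurements to predict initial production rate. Every feature, coefficient and value is invented for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic "field measurements": per-well porosity, net pay and skin,
# with an assumed nonlinear, noisy relationship to initial rate.
n = 300
porosity = rng.uniform(0.08, 0.28, n)
net_pay = rng.uniform(5.0, 60.0, n)   # metres
skin = rng.normal(0.0, 2.0, n)
rate = 800 * porosity * net_pay / (1 + 0.1 * np.abs(skin)) \
       + rng.normal(0, 20, n)

X = np.column_stack([porosity, net_pay, skin])
X_tr, X_te, y_tr, y_te = train_test_split(X, rate, random_state=0)

scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                     random_state=0)
model.fit(scaler.transform(X_tr), y_tr)
print("R^2 on held-out wells:", model.score(scaler.transform(X_te), y_te))
```

The point is the short development cycle: a few lines, a modest data set, and a usable complement to the full-physics model.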

  • Drilling Automation

This same approach can be applied to the process of drilling a wellbore. Current initiatives in this field are based on the development of automated drilling solutions that manage the efficiency of the process while mitigating risks to rig personnel, the environment and the assets. To date, much of the effort has focused on developing solutions to control drilling equipment. However, drilling is a complex operation that aggregates information from both surface and down-hole sensors to optimize performance and mitigate risk. Additionally, many of the rig’s physical elements – such as the hoisting systems – are stochastic in nature: there is no guarantee that control parameters will map exactly to system output.
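A toy Monte Carlo sketch makes the point. Below, the same weight-on-bit setpoint is pushed through an assumed noisy transfer function (the coefficients are illustrative, not field-calibrated), and the rate of penetration comes back as a distribution rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulated_rop(wob_setpoint_klb, n_trials=1000):
    """Monte Carlo sketch of rate of penetration (ROP) for a fixed
    weight-on-bit setpoint. Hoisting compliance and bit/formation
    variability are modelled as random terms, so the same control
    input yields a distribution of outputs, not a single value.
    All coefficients are illustrative.
    """
    delivered_wob = wob_setpoint_klb * rng.normal(1.0, 0.05, n_trials)
    formation = rng.lognormal(mean=0.0, sigma=0.15, size=n_trials)
    return 3.5 * delivered_wob * formation  # ft/hr, toy response model

rop = simulated_rop(25.0)
print("setpoint 25 klb -> ROP p10/p50/p90:",
      np.percentile(rop, [10, 50, 90]).round(1))
```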

In order to develop prescriptive solutions, these various levels of complexity need to be integrated within an intelligent analytics platform that provides advisory output and associated levels of confidence in the outcome. High-speed, low-latency event stream processing solutions can deploy complex analytical models in the sub-second environment required by drilling control systems. Imagine a future where autonomous drilling rigs not only drill efficiently and safely but also act as conduits to a higher logic of execution. The focus then shifts from managing tactical situations to exploiting strategies across a portfolio of assets.
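As a minimal sketch of the idea, assuming nothing about any particular vendor’s streaming engine, the code below scores each incoming surface-torque sample against a sliding window and emits an advisory with a rough confidence figure; class name, thresholds and data are all invented for the example.

```python
from collections import deque
import statistics

class TorqueStreamMonitor:
    """Toy sliding-window monitor standing in for an event stream
    processing engine: score each new surface-torque sample against
    the recent window and emit an advisory with a rough confidence."""

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold  # z-score that triggers an advisory

    def on_sample(self, torque):
        if len(self.window) >= 10:
            mu = statistics.fmean(self.window)
            sigma = statistics.pstdev(self.window) or 1e-9
            z = abs(torque - mu) / sigma
            if z > self.threshold:
                # crude confidence: how far past the threshold we are
                conf = min(0.99, z / (2 * self.threshold))
                print(f"advisory: torque anomaly z={z:.1f}, "
                      f"confidence~{conf:.2f}")
        self.window.append(torque)

monitor = TorqueStreamMonitor()
for t in [12.0] * 60 + [19.5]:  # step change in the final sample
    monitor.on_sample(t)
```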

  • Reservoir Management

Controlling operations to maximize both short- and long-term production requires lifecycle optimization. This optimization considers reservoir model uncertainties as well as production measurements, time-lapse seismic, and other available data. How can we use that huge amount of information in an efficient way and ensure that the reservoir models are kept current and consistent?
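One hedged sketch of keeping models current and consistent: treat the reservoir models as an ensemble of realizations and re-weight them against a new production measurement with a simple likelihood update. All numbers below are synthetic, and a real history match is far more involved.

```python
import numpy as np

rng = np.random.default_rng(1)

# An ensemble of reservoir model realizations, each forecasting a
# production rate; an observed measurement then re-weights the ensemble.
n_models = 200
forecasts = rng.normal(loc=1000.0, scale=150.0, size=n_models)  # bbl/d

observed = 1080.0   # measured rate (synthetic)
meas_sigma = 50.0   # assumed measurement uncertainty

# Likelihood weighting: a one-step Bayesian update of model weights
log_w = -0.5 * ((forecasts - observed) / meas_sigma) ** 2
w = np.exp(log_w - log_w.max())
w /= w.sum()

print(f"prior mean {forecasts.mean():.0f} bbl/d -> "
      f"posterior mean {np.sum(w * forecasts):.0f} bbl/d")
```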

In reservoir management, arbitrarily complex and multivariate models can be produced by a data-driven methodology, while parametric models tend to be limited by human comprehension. Do reservoir managers depend too much on empirical observations? As the industry generates more data variety and deals with the avalanche of real-time data streaming in from intelligent well sensors, it evolves into an “Internet of Things” environment that necessitates a data-driven suite of soft computing techniques.

Still, the industry pursues a higher level of abstraction in reservoir management. In response to the plethora of real-time data from sensors deployed in intelligent wells, a reservoir management solution must embrace advanced technical tools as well as automated analytical workflows, while remote work practices promote collaborative centers of excellence.

The Path Forward

In each of these cases, the best practice is a hybrid solution that marries a data-driven suite of advanced analytical methodologies with the wisdom of experienced engineers to constrain modeling across the E&P value chain. This hybrid solution is gaining rapid acceptance within the engineering silos as a means to feed their vast pool of knowledge into upstream models as deductive information. In the future we anticipate situationally and contextually aware analytics with the ability to automatically determine the different phases of operation and adjust or alter modes for real-time system optimization.
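A minimal sketch of such a hybrid, under wholly synthetic assumptions: a first-principles productivity-index estimate supplies the trend, and a machine-learned model corrects the structured residual the physics misses.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)

# Synthetic "truth": a physics trend plus a structured effect the
# physics model misses (all values illustrative).
drawdown = rng.uniform(50.0, 400.0, 250)   # psi
pi = 2.0                                   # assumed productivity index
rate = pi * drawdown + 30 * np.sin(drawdown / 60) + rng.normal(0, 5, 250)

physics = pi * drawdown                    # first-principles estimate
X = drawdown.reshape(-1, 1)
residual_model = GradientBoostingRegressor(random_state=0)
residual_model.fit(X, rate - physics)      # learn only what physics misses

hybrid = physics + residual_model.predict(X)
# In-sample errors, for illustration only
print("physics-only RMSE:", np.sqrt(np.mean((rate - physics) ** 2)).round(1))
print("hybrid RMSE:", np.sqrt(np.mean((rate - hybrid) ** 2)).round(1))
```

The design choice is the point: the physics keeps the model honest where data are sparse, and the data-driven residual captures what the engineer’s equations leave out.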

William Shakespeare noted in “Measure for Measure”: “Our doubts are traitors and make us lose the good we oft might win by fearing to attempt.” Where do these doubts come from, and how can we overcome them? The answers may lie in the generations discussed at the beginning of this series. As Generation X, we inherently wish to understand the “how” of any solution, expecting the output to be linear and complete. Generation Y seems to have reversed this notion; their focus is on the strategic use of solutions aimed at a multitude of possibilities. Let the data do the talking, and witness what the next generation is able to achieve.