Smarter heavy oil fields allow more dynamic interaction with data

It’s not the size of your data that matters – it’s how you use it. From social media feeds to call records and videos, data is all around us and available anytime, anywhere. But the sheer vastness of data means it may sometimes be difficult to leverage. This is especially true within the oil and gas industry as many organizations are simply unequipped to handle its volume and velocity.

Big data mining techniques used in the past, such as clustering algorithms and pattern associations, provided insights that increased production and predicted downtime of oil wells. We are now drowning in information and as we instrument more assets, we bring on even more data. Today, the dilemma most oil corporations face is analyzing all of this information and putting these gleaned insights to good use.

As one of the world’s largest oil producers, Canada must be at the forefront of industry innovation. The Canadian Association of Petroleum Producers predicts that total Canadian crude oil production will increase to 6.4 million barrels per day by 2030, up from 3.5 million barrels per day in 2013 (2014 Crude Oil Forecasts, Markets and Transportation). To meet this global demand, Canadian companies should consider improving their current operational technology capabilities and driving enterprise innovation through connected systems of insight rather than discrete, disparate innovation initiatives.

Recent advances in analytics and technologies can feed analytic engines with massive streams of information, improving corporations’ capacity to process every available data set and extract every relevant insight the data has to offer. A combination of computational science and field domain knowledge can be applied to build true, refined models of specific systems, which are then optimized by determining ideal steam process flows that ensure maximum capture under controlled conditions, assessing how modified heating cycles affect production, and identifying which mixtures work best when introducing certain solvents.
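To make this kind of model-based optimization concrete, here is a minimal sketch: a brute-force search over candidate steam settings against a toy response surface. The model, parameter names and ranges are invented for illustration only and stand in for the history-matched reservoir models a real engine would use.

```python
# Hypothetical sketch only: the production model and parameter ranges
# below are invented for illustration, not drawn from any field data.

def production_rate(steam_flow, cycle_hours):
    """Toy response surface: production rises with steam flow but
    diminishes at high flows, and falls off when heating cycles
    stray from an assumed 12-hour optimum."""
    return steam_flow * 2.0 - 0.05 * steam_flow ** 2 - abs(cycle_hours - 12) * 1.5

def best_settings(flows, cycles):
    """Exhaustively score every candidate (flow, cycle) pair and
    return the best; a real optimizer would be far more sophisticated."""
    return max(
        ((f, c) for f in flows for c in cycles),
        key=lambda fc: production_rate(*fc),
    )

flow, cycle = best_settings(flows=range(5, 41, 5), cycles=range(6, 25, 2))
print(flow, cycle)
```

The same pattern, scoring many candidate operating points against a calibrated model, underlies the steam-flow and heating-cycle decisions described above.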

Streamlining data

Whether used to advance oil production or predict well failures, the petroleum industry must acquire and analyze huge streams of continuous data and then take appropriate actions as necessary.

This kind of environment is ideal for streaming technology systems that enable real-time analytic processing of multiple terabytes of structured or unstructured data in motion. These systems allow technical applications to quickly ingest, analyze and respond to information as it arrives from thousands of real-time sources. Streaming solutions can handle very high data throughput rates of up to millions of events per second. From here, the data is accumulated, evaluated and used to detect errors, improve performance and greatly increase an oil company’s overall competitive edge.
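The ingest-analyze-respond loop described above can be sketched in a few lines. The sensor fields, window size and alert threshold here are assumptions for illustration, not any vendor's actual rules:

```python
# Minimal sketch of ingest -> analyze -> act on data in motion.
# Field names ("well", "pressure") and the 20% threshold are
# illustrative assumptions only.
from collections import deque

WINDOW = deque(maxlen=100)  # sliding window of recent pressure readings

def on_event(reading):
    """Process one sensor reading as it arrives; flag it if it
    exceeds the recent rolling average by more than 20%."""
    WINDOW.append(reading["pressure"])
    avg = sum(WINDOW) / len(WINDOW)
    if reading["pressure"] > avg * 1.2:  # simple exception rule
        return f"ALERT well={reading['well']} pressure={reading['pressure']}"
    return None

alerts = [a for a in map(on_event, [
    {"well": "W1", "pressure": 100},
    {"well": "W1", "pressure": 101},
    {"well": "W1", "pressure": 140},  # spike versus recent average
]) if a]
print(alerts)
```

A production streaming engine distributes this kind of per-event logic across many nodes to reach millions of events per second, but the shape of the computation is the same.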

Take, for instance, the ability to predict danger to a drilling or production rig from ice floes in the ocean. Data comprising weather, water current, satellite imagery, rig status and video details can help provide a complete picture of the field environment, including its current and future state. As these data streams are captured, decisions can be made to set limits for mining tools and “scoring” algorithms. Data mining algorithms analyze oil well characteristics, and the resulting models are stored in the streaming tools. As data moves through the streaming tools, it is scored in real time to enable alerts and triggers for specific events. In this case, streaming technology can help large integrated oil companies better monitor ice floes and provide both real-time warnings and ample time to move rigs if potential danger is spotted.
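As a hedged illustration of what real-time scoring might look like here, the sketch below combines a few of the streams mentioned above (floe distance, drift speed, bearing) into a single risk score with an alert threshold. The features, weights and threshold are invented, not a mined model:

```python
# Hypothetical ice-floe risk score. Feature names, formula and
# threshold are illustrative assumptions, not a trained model.

def floe_risk(distance_km, drift_speed_kmh, bearing_to_rig_deg):
    """Higher score means sooner potential contact with the rig.
    A floe drifting straight at the rig (bearing 0) closes fastest."""
    closing = drift_speed_kmh * max(0.0, 1 - abs(bearing_to_rig_deg) / 90)
    hours_to_contact = distance_km / closing if closing > 0 else float("inf")
    return 1.0 / (1.0 + hours_to_contact)  # maps to (0, 1]

def should_alert(score, threshold=0.05):
    """Trigger a warning when the score crosses the assumed threshold."""
    return score >= threshold

risk = floe_risk(distance_km=40, drift_speed_kmh=3, bearing_to_rig_deg=10)
print(round(risk, 3), should_alert(risk))
```

In a real deployment, the scoring function would come from offline data mining and the streaming engine would evaluate it continuously against satellite and sensor feeds.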

The same capability is used in oilfields, where streaming technology can help organizations move beyond traditional real-time monitoring to more agile, real-time prediction – bringing new functions to heavy oil, enhancing production and managing well flow systems for both water flood and steam injection. Heavy oil requires various thermal, chemical, physical and microbiological methods to help extract greater percentages from reservoirs.

These methods typically introduce health and safety risks that require more rigorous protocols of care.

Integrating additional data sources, like oil streams, lab reports and chemical analysis, makes it even more critical to measure certain features, such as subsidence and ground movement from steam injection, in real time.

Applying analytics to the oilfield

In order to achieve optimal streaming technology results, oil companies must instrument the correct data during the decision-making process, interconnect accurate methodologies and governance to apply their results, and intelligently make sense of the output across the whole field in a useful, actionable manner.

The vision of a smarter heavy oilfield leverages technologies that integrate and holistically manage the field to maximize asset value. This begins with an instrumented oil well that can transmit data on a continuous basis from technologies such as downhole surveillance systems, including permanent downhole gauges (PDGs), and fibre optic systems that integrate a multitude of measurements. Information retrieved from oil wells can be collected and sorted by asset class, while the flowing data is pre-processed to look for abnormalities and events. In addition, different data types – such as weather, location, geophysical data, events, surface temperatures and pressures, well-site performance irregularities and environmental health and safety requirements – can be integrated to provide a holistic view of oilfield production.
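Sorting continuous telemetry by asset class while screening for abnormal values, as described above, might be sketched like this. The sensor classes and valid ranges are assumptions for illustration:

```python
# Sketch: route well telemetry by asset class while screening for
# out-of-range readings. Sensor names and limits are assumed values.
from collections import defaultdict

# Assumed valid operating ranges per sensor class (illustrative units)
LIMITS = {"pdg_pressure": (50, 300), "fibre_temp": (20, 250)}

def route(readings):
    """Split a batch of readings into per-class buckets, diverting
    anything outside its assumed range into an abnormal list."""
    by_class, abnormal = defaultdict(list), []
    for r in readings:
        lo, hi = LIMITS[r["sensor"]]
        if lo <= r["value"] <= hi:
            by_class[r["sensor"]].append(r)
        else:
            abnormal.append(r)
    return by_class, abnormal

normal, bad = route([
    {"sensor": "pdg_pressure", "value": 120},
    {"sensor": "fibre_temp", "value": 400},  # outside assumed range
])
print(len(normal["pdg_pressure"]), len(bad))
```

The abnormal bucket is what feeds the event detection and alerting layers, while the per-class buckets supply the integrated, field-wide view.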

Here, the goal for oil companies is to make faster, better decisions and take action accordingly. To do this, corporations need to ensure they collect the right data sets. Integrating streaming technology allows many more sources of complex data, both structured and unstructured, to be analyzed in motion with more sophisticated logic. On top of this, streaming data can be analyzed and manipulated at extreme speed while still in motion, enabling effective exception-based surveillance. The resulting data sets will be considerably more “event rich” than the original raw data and can be further enhanced for deeper analysis.
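One simple way to see how exception-based surveillance yields an "event rich" set is a deadband filter: only readings that deviate meaningfully from the last reported value are kept. The deadband width below is an assumed value for illustration:

```python
# Sketch of exception-based surveillance via a deadband filter.
# The 5-unit deadband is an illustrative assumption.

def exceptions(values, deadband=5.0):
    """Keep only readings that move at least `deadband` away from
    the last reported value, compressing steady-state noise."""
    events, last = [], None
    for v in values:
        if last is None or abs(v - last) >= deadband:
            events.append(v)
            last = v
    return events

raw = [100, 101, 102, 120, 121, 90]
print(exceptions(raw))
```

Six raw readings collapse to three meaningful events, which is exactly the kind of condensed, event-rich stream that downstream analytics can afford to examine deeply.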

If oil and gas organizations determine specific analytical protocols for categorizing the current and future production of wells in one field, they will need to apply those same protocols to nearby fields within a similar environment. As such, these corporations must make appropriate assumptions about the acquisition, quality and integrity of the collected data.

Standardized surface, sub-surface and enterprise-wide data is essential for analysis, as most oil and gas organizations apply the same techniques across the field to detect well- and field-level behavioural changes reflective of the environment. Data governance is integral to ensuring standardization in data acquisition and management. Understanding how “clean” your data is becomes vital to deciding which analytical techniques can be applied.

The third step in aggregating and processing oil data sets involves using domain expertise to help design and interpret the acquisition and analysis of data. An oil well can be characterized by its initial production trends, which indicate future production outcomes. For example, gas fields are usually found in a braided stream depositional environment. To predict which oil wells require modifications and which do not, each gas field should be treated independently, rather than aggregating data from all of the fields and then performing an analysis.
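The per-field-versus-pooled point above can be sketched with a toy decline calculation, fitted to each field independently. The field names, rates and the simple average-decline "model" are illustrative assumptions:

```python
# Sketch: estimate a decline trend per field independently rather
# than pooling all fields. Data and model are illustrative only.

def fit_decline(rates):
    """Average month-over-month fractional decline for one field."""
    drops = [(a - b) / a for a, b in zip(rates, rates[1:])]
    return sum(drops) / len(drops)

# Hypothetical monthly production rates for two fields
fields = {
    "field_A": [100, 90, 81],      # declining ~10% per month
    "field_B": [200, 190, 180.5],  # declining ~5% per month
}

per_field = {name: round(fit_decline(r), 3) for name, r in fields.items()}
print(per_field)
```

Pooling the two fields would blur their distinct decline signatures; fitting each independently preserves the behaviour that actually drives modification decisions.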

A new era of efficiency

Between discovering new reserves, streamlining global operations and maximizing the yield of old and new wells, oil and gas companies are challenged to effectively manage and coordinate data, extract insight and increase productivity – and they are leaving both money and product on the table. Fortunately, there are tools and cutting-edge technologies capable of propelling oil and gas companies into a new era of efficiency.

Heavy oil has unique challenges that are addressed with various recovery techniques, requiring additional sources and streams of integrated data to provide a holistic view of field operations. The right technology solutions can equip corporations to quickly analyze incoming data sets and apply that information to complex analytical models, allowing them to create tactical insights that help increase drilling and production performance while preventing future problems.

Today, streaming technology can be used efficiently throughout the oil and gas industry to monitor well health, optimize oil production for both heavy oil and gas, predict failures during cyclical steam injection, and monitor conditions that would affect the normal operations of a field. Organizations need systems to handle the distinct challenges of big data and move them from a disparate, disconnected approach to an integrated effort that will drive further innovation and competitive advantage.

Alex Zekulin, Ph.D., is with the IBM Institute for Business Value. Ross Manning is a partner with IBM Canada, Global Business Solutions.
