Explor breakthrough a game-changer for seismic data processing in the cloud
Results of collaboration with Halliburton and AWS to reduce costs and shorten timelines
Explor successfully ran Seismic Processing, a DecisionSpace 365 cloud application powered by iEnergy on AWS, leveraging a range of elastic compute instances to optimize key seismic processing workflows. In the first phase of a proof of concept, multiple benchmarking tests were run that demonstrated:
- An 85% decrease in CDP sort order times: Tested by sorting 308 million traces comprising 1.72 TB from shot domain to CDP domain, completing the flow in an hour.
- An 88% decrease in CDP FK Filtering times: Tested with a 57 million-trace subset of the data comprising 318 GB, completing the flow in less than 6 minutes.
- An 82% decrease in pre-stack time migration times: Tested on the full 165 million-trace dataset comprising 922 GB, completing the flow in 54 minutes.
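The reported percentage decreases also imply how long each flow took before optimization, and the trace counts and runtimes imply sustained throughput. A minimal sketch of that arithmetic (the figures come from the benchmarks above; the helper names are illustrative, not part of any product API):

```python
def original_minutes(reduced_minutes, pct_decrease):
    """Infer the pre-optimization runtime from the reduced runtime
    and the reported percentage decrease (illustrative arithmetic)."""
    return reduced_minutes / (1 - pct_decrease / 100)

def traces_per_minute(traces, minutes):
    """Sustained throughput in traces per minute."""
    return traces / minutes

# Figures from the benchmarks above; FK filtering is stated as
# "less than 6 minutes", so 6 is an upper bound.
benchmarks = {
    "CDP sort":                 {"traces": 308e6, "minutes": 60, "decrease": 85},
    "CDP FK filtering":         {"traces": 57e6,  "minutes": 6,  "decrease": 88},
    "Pre-stack time migration": {"traces": 165e6, "minutes": 54, "decrease": 82},
}

for name, b in benchmarks.items():
    before = original_minutes(b["minutes"], b["decrease"])
    rate = traces_per_minute(b["traces"], b["minutes"])
    print(f"{name}: ~{before:.0f} min before, {b['minutes']} min after, "
          f"~{rate / 1e6:.1f}M traces/min")
```

For example, an 85% decrease down to 60 minutes implies the CDP sort previously took roughly 400 minutes (60 / 0.15), or close to seven hours.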
For this project, Explor provided the 3D seismic dataset as well as data science and geophysical expertise. Landmark, a Halliburton business line, provided Seismic Processing, a DecisionSpace 365 cloud application powered by iEnergy, along with technical expertise, and Amazon Web Services (AWS) provided the cloud computing resources and a team of technical experts and Solutions Architects.
Working together on the project, Explor, Halliburton, and AWS were able to optimize the cloud solution to reduce total processing timelines by 90 percent.
"This outcome will drive a step-change in seismic data processing and is a key stepping-stone in the digital transformation of the global seismic industry. By reducing both timelines and the CapEx burden on seismic data processors, the industry will be able to scale on demand as we acquire ever-increasing seismic trace densities," said Allan Châtenay, President of Explor.
The challenges driving this project were the need to process ever-increasing sizes of seismic data while delivering higher quality results at lower cost. In recent years, the seismic industry has dramatically increased 3D seismic trace densities, with several companies (including Explor) breaking the 100 million traces/km2 threshold. Surveys exceeding a billion traces/km2 are now being planned. This exponential growth in acquired seismic trace densities poses new challenges for seismic data processing, as processors must deal with data volumes of several hundred terabytes for small- or medium-sized surveys, with large high-density surveys producing petabytes of data each month. Difficult and volatile market conditions make capital investments in high performance computing infrastructure challenging and risky for processing companies.
"Explor is committed to working with industry partners to create an ecosystem that can support ultra-high-density acquisition," said Châtenay. "Seismic data processors will access on-demand cloud computing to support surges in demand, helping processing companies reduce cost and timelines in these very challenging times. We look forward to working with a range of industry partners in support of this effort."
Halliburton's Seismic Processing, a DecisionSpace 365 cloud application powered by iEnergy and built on proven SeisSpace software technology, is one of the industry leaders, delivering a flexible, extensible, user-friendly seismic processing foundation. This project delivered the first scalable cloud-deployed seismic processing engine as part of Halliburton's strategy to deliver the lowest total cost of ownership (TCO) in the age of digital transformation.
"The collaboration with AWS and Explor demonstrates the power of digital investments that Halliburton is making, in this instance to bring high-density surveys to market faster and more economically than ever before. By working with industry thought leaders like Explor and AWS, we have been able to demonstrate that digital transformation can deliver step-change improvements in the seismic processing market," said Philip Norlund, Geophysics Domain Manager, Halliburton, Landmark.
To achieve the desired scale and reduce seismic processing timelines by 90%, different steps of the project required access to a variety of compute instances (38,000 cores in total) and high-throughput storage, which the teams were able to provision on demand as needed. AWS provides the most elastic and scalable cloud infrastructure to run these types of HPC applications. With virtually unlimited capacity, engineers, researchers, and HPC system owners can innovate beyond the limitations of on-premises HPC infrastructure.
"This industry transforming project with Explor and Halliburton is a great example of the value AWS's comprehensive set of cloud services brings. HPC on AWS removes long wait times and lost productivity often associated with fixed capacity on-premises," said Barry Bolding, Director HPC, Amazon Web Services, Inc. "Flexible compute, storage, and networking configurations allow users to grow or shrink infrastructure environments as needed based on business requirements. Additionally, with access to cloud-based services like Data Analytics, and Machine Learning (ML), users can redefine traditional HPC workflows and innovate faster." -
According to Explor, a diverse 20-person, multi-disciplinary team completed this project during the COVID-19 pandemic, with team members working mostly from their homes in three different cities in two different countries, further proving the value of cloud computing to minimize risk, drive innovation, support collaboration and reduce costs.