As pressure mounts to cut upstream costs, ExxonMobil expects to save significant time and money by applying brute force in the computing department.
Working with the National Center for Supercomputing Applications (NCSA), the US major said February 16 that it has “achieved a major breakthrough with proprietary software using more than four times the previous number of processors used on complex oil and gas reservoir simulation models to improve exploration and production results.”
As a consequence, “ExxonMobil geoscientists and engineers can now make better investment decisions by more efficiently predicting reservoir performance under geological uncertainty to assess a higher volume of alternative development plans in less time.”
This enables ExxonMobil’s geoscientists and engineers to make better, faster decisions on how to develop and manage oil and gas reservoirs, it said, adding: “The record run resulted in data output thousands of times faster than typical oil and gas industry reservoir simulation. It was the largest number of processor counts reported by the oil and gas industry, and one of the largest simulations reported by industry in engineering disciplines such as aerospace and manufacturing.”
To accurately model the complex flow of oil, water, and natural gas through a reservoir, simulation software must solve a large system of coupled equations over millions of grid cells. Current reservoir management practice in the oil and gas industry is often hampered by the slow speed of reservoir simulation.
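The scale of the computational problem can be illustrated with a toy example. The sketch below is a hypothetical illustration, not ExxonMobil's proprietary software: where an industrial simulator solves coupled multiphase equations over millions of cells on thousands of processors, this reduces the problem to single-phase pressure diffusion on a ten-cell 1D grid, stepped implicitly and solved with the standard Thomas (tridiagonal) algorithm.

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def pressure_step(p, alpha):
    """One implicit Euler step of dp/dt = alpha * d2p/dx2 (unit dx, dt),
    with fixed-pressure boundary cells (e.g. an injector and a producer)."""
    n = len(p)
    a = [-alpha] * n
    b = [1.0 + 2.0 * alpha] * n
    c = [-alpha] * n
    d = list(p)
    # Dirichlet boundaries: pin the pressures in the two end cells.
    a[0] = c[0] = 0.0; b[0] = 1.0
    a[-1] = c[-1] = 0.0; b[-1] = 1.0
    return thomas_solve(a, b, c, d)

# Hypothetical scenario: injector held at 300 bar, producer at 100 bar.
p = [300.0] + [100.0] * 9
for _ in range(200):
    p = pressure_step(p, alpha=0.5)
print([round(v, 1) for v in p])  # pressure profile relaxing toward steady state
```

Even this tiny model needs a linear solve per time step; a field-scale model repeats that over vastly larger, coupled systems, which is why throwing more processors at the solver pays off so directly.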
In July 2014, Eni switched on its upgraded high-performance computing (HPC) system, enabling it to support exploration and reservoir-mapping activities more effectively. It described the system as the largest used in the oil and gas industry in Europe, and one of the industry's largest worldwide. Repsol too has made much of its use of supercomputers in Barcelona and Houston for upstream planning and reservoir mapping.

(Source: William Powell at Natural Gas World)