Predicting uncertainty in oil & gas comes easily to internationally recognised reservoir engineer, mathematician and programmer Dr Andrew Wadsley, a builder of simulators and data systems. Learn what it takes to predict the uncertain. Read on.
Reservoir engineering has always been a complex task, simultaneously helped by and burdened with incredible amounts of data. Even the self-styled ‘simulation experts’ in the industry aren’t taking full advantage of the data they currently have.
Other reservoir engineers, though, aren’t so intimidated by the task of integrating ever-increasing amounts of information into their systems, and they also understand the principle of lex parsimoniae, the law of parsimony, aka Occam’s Razor.
Enter Dr Andrew Wadsley, an award-winning mathematician and programmer extraordinaire who, for 40 years, has created his own simulators from scratch, simulators now used by some of the world’s most powerful oil and gas operators. In addition, he has his own theorem in differential topology (the A.W. Wadsley Theorem), has won the Australian National University Medal in Mathematics, has served as an umpire in gas reserves disputes, and has authored numerous technical papers, including a recent paper on ethylene polymerization co-authored with his son-in-law.
Following more than a decade working for major industry names around the world, such as Shell International, Dr Wadsley progressed from well-site petroleum engineer to specialist in reservoir modelling and field development planning, developing his own integrated Reservoir-to-Market suite of planning and production software tools and giving himself a distinct competitive advantage in the marketplace. In his own words: “In the first 5 years after I returned home to Tasmania, I wrote more reservoir engineering software than Shell did in the same period. Being a long way from the oil industry centre meant I had to become self-reliant.”
Andrew’s success at predicting uncertainty using novel Monte Carlo techniques was such that he decided to start his own independent company, Stochastic Simulation, a provider of his exclusively patented simulator programs, and he has been engaged by leading petroleum enterprises ever since.
His speciality, reservoir management and production optimisation, has seen him engaged on numerous strategic projects along with invitations to speak at Universities and industry functions around the globe.
One of the first to pioneer integrated production planning and reservoir simulation, Dr Wadsley has seen it all: how the industry, its attitudes and its processes have changed.
In his last interview, given to the Hobart Mercury in the late ’90s, Dr Wadsley spoke about how reservoir engineers need more data to drive their simulations and how companies need to adopt new simulation processes to compete in a volatile market.
Today he breaks his silence on the current industry climate, what really grinds his gears, career highlights, advice for jobseekers and, most importantly, what the future holds for reservoir simulation.
Q. Hi Andrew. Describe your typical work day.
That really depends whether it’s winter or summer. In winter, after stoking the wood boiler which heats the house, I usually go out and carry some hay or grain to my cattle. Often this is a chilly introduction to the day.
Then I get a couple of hours of email-free work done before the Perth office wakes up two hours later. Typically, the day then consists of video conferencing with Perth or other clients around the world, interspersed with coding, working through the emails, or streamlining the workflows.
Q. What are you spending your energy on at the moment?
Back in the 1990s I integrated the Gippsland Basin aquifer simulation model with my gas planning software. That was exciting work as it attached gas-water tank reservoirs described by volume-depth relations with water influx from the finite-difference, single-phase water model of the large regional aquifer. Before then, in the 1980s, I had worked with Shell’s Gasso simulator which integrated a surface pipeline network with a single phase gas reservoir.
Now I am finalising the integration of the ResAssure simulator with the GasAssure network and planning model. Again this is exciting as it involves full probabilistic modelling of sub-surface uncertainties with network reliability. Even after 30 years there is still more work to be done in this area.
Of course, there’s always work to be done on the farm. Over the past three weeks we have had very strong winds which have dropped many trees – mostly wattles which grew up after the 1967 bush-fires in Tasmania and are now dying of old age. I’m now out with the chain saw chopping these up for firewood and mending the fences which have been crushed.
Q. How did you end up working in the oil and gas sector?
I finished my PhD in Mathematics in 1974 and looked for a job. I was offered a post-doc in Paris but, with two young children, my wife and I were a little tired of being honest but poor students. Shell had been advertising for PhDs to join their Rijswijk research lab in the Netherlands so I trimmed my hair back from mid-waist to shoulder length, was interviewed in the Hague and subsequently offered a job.
Q. Why did you want to become a reservoir engineer?
This is one of those mysteries that dramatically changed my life. Having been offered a job, I arrived in the Netherlands to start work on Monday, January 5, 1975 and was immediately whisked away to the Shell training centre in Wassenaar. Two months later I found I was on the well-site petroleum engineer’s training course, and not destined for the lab. It turned out that January 5 was the start date of the well-site PE course, and what was mere coincidence became somewhat more.
From there I went to Brunei and worked offshore as a well-site PE and later moved to the Hague with Shell International in their computer applications group. During the training course I was taught reservoir engineering by Laurie Dake who made a real impression on me, so perhaps because of Laurie I became a reservoir engineer. Ten years later, I had the privilege of working with him in a consulting group in London. In one of life’s ironies, in the 1990s as an expert witness I was asked whether Laurie Dake had applied the principles of material balance correctly – here I was, the student, declaiming on one of the greatest classical reservoir engineers ever!
Coincidentally, a couple of years ago, some 35 years after working in Brunei, I was again on site there supervising a well test. I still had my original hard-hat: it seems some things never change.
Q. What do you believe are the biggest challenges currently facing the O&G industry?
Frankly, reservoir modelling is in crisis. Senior management in oil and gas companies do not believe the results of reservoir modelling and simulation, particularly for complex reservoirs just when good modelling outcomes are vitally needed. Part of this is their own fault as Boards and decision makers only want to see one number, despite notionally acknowledging P90, P50 and P10 reporting requirements for reserves. While lip-service is paid to uncertainty, most Boards and financiers find it very hard to deal with.
But the disbelief in reservoir modelling has deeper roots than this. It arises because the results of reservoir modelling are often very poor indeed. As reservoir complexity increases, predicted recoveries can differ from actual reservoir performance by more than 100%, usually in the downward direction. This has been documented by many studies of North Sea fields, particularly that of Droomgoole and Speers in 2008.
And the response of the profession?
Reservoir models become larger and larger, as though size alone can compensate for inadequately calibrated data and understanding of real geologic uncertainty. In particular, geostatistical extrapolation of rock properties goes way beyond core and log data controls and has veered into fantasy and speculation.
I recently saw a model where over 100 different geostatistical realisations were assumed to cover the range of uncertainty, where not one of these realisations could be validated as approximating the true distribution of permeability and porosity, and where major uncertainties such as structure outside of well control, fault seal and compartmentalisation had been ignored. Another model extrapolated porosity some 8% outside the range of core data to unphysically high values. Another had simulation cells physically intersecting in the model. Yet another had 250 very thin reservoir layers yet the average areal cell size was 200m x 200m – in this case, truncation errors alone destroy any validity of the dynamic model results. These models were being used to plan multi-billion dollar field developments – all were unfit for purpose!
Reservoir simulation is not like computational fluid dynamics (CFD) used to design modern jet aircraft like the Airbus A380. Most CFD models are calibrated through detailed wind-tunnel experiments. We cannot do this in reservoir simulation as we do not drill enough wells (to calibrate the geology) and the results of the ‘experiment’ are only known at the end of field life.
This inability to correctly quantify future reservoir performance is having a major impact on project finance. By ‘correct’ I mean defining a proper range of outcomes from, say, the P97 to the P03. In the low oil price environment in which we now find ourselves, unless financiers and Boards believe the outcomes of reservoir modelling, then new projects which should have gone ahead will be shelved and producing fields will be abandoned prematurely.
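To make the P90/P50/P10 terminology concrete, here is a minimal pure-Python sketch of a volumetric Monte Carlo run. Every distribution and parameter value below is an invented illustration, not a representation of Dr Wadsley’s software or any real field; the only firm content is the petroleum convention that P90 is the value exceeded with 90% probability, i.e. the 10th percentile of the sampled distribution.

```python
import math
import random

random.seed(42)

# Hypothetical volumetric model: reserves = area * thickness * porosity * RF.
# All distributions are illustrative assumptions only.
def sample_reserves():
    area = random.lognormvariate(math.log(2.0e7), 0.3)   # m^2
    thickness = random.gauss(20.0, 4.0)                  # m
    porosity = random.uniform(0.15, 0.25)                # fraction
    recovery = random.triangular(0.20, 0.50, 0.35)       # fraction, mode 0.35
    return area * thickness * porosity * recovery        # m^3

samples = sorted(sample_reserves() for _ in range(20_000))

def percentile(vals, p):
    # vals must be sorted ascending; simple index-based percentile
    return vals[min(len(vals) - 1, int(p / 100 * len(vals)))]

# Petroleum convention: P90 is exceeded with 90% probability,
# so it sits at the 10th percentile of the distribution (and P10 at the 90th).
p90 = percentile(samples, 10)
p50 = percentile(samples, 50)
p10 = percentile(samples, 90)
print(f"P90 {p90:.3e}  P50 {p50:.3e}  P10 {p10:.3e} m^3")
```

Reporting the whole spread (or, as Andrew suggests, P97 to P03) rather than the single P50 number is what Boards and financiers find hard, but it is exactly what a probabilistic run produces for free.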
Q. How has the industry changed during your career?
Computers have got faster, simulation models have grown at the same rate, geophysical processing has progressed by leaps and bounds, but interpretation of processed seismic is still largely hand-crafted.
In a recent talk to the local SPE Chapter in Bangkok, Thailand, I showed a graph plotting peak computer speed versus Saudi Aramco simulation model size. In 1975 when I joined the industry, maximum model size was about 10,000 (10^4) cells and peak speed was about 100 million (10^8) flops. Now maximum model size is quoted as 1 trillion (10^12) cells, and peak computer speed is 10 quadrillion (10^16) flops. The ratio of speed to model size has remained static at 10,000 flops per cell!
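The arithmetic behind that static ratio can be checked directly from the orders of magnitude quoted above (the 1975 and present-day figures are from the interview; the code is just a worked check):

```python
# Peak machine speed (flops) vs. maximum quoted model size (cells),
# using the orders of magnitude from the talk.
eras = {
    "1975":  {"flops": 1e8,  "cells": 1e4},
    "today": {"flops": 1e16, "cells": 1e12},
}
ratios = {year: d["flops"] / d["cells"] for year, d in eras.items()}
for year, r in ratios.items():
    print(f"{year}: {r:.0e} flops available per cell")
# Both eras work out to 1e4 flops per cell: the entire speed gain
# has been absorbed by model growth.
```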
All of the staggering increase in peak computing power (for a large parallel cluster, by a factor of 100 million) has been wasted on modelling ever larger models. Why do I say wasted? Because computers held the promise of faster and better history-matching and the ability to quickly screen multiple field development scenarios to achieve near optimum outcomes (in the presence of uncertainty). We weren’t doing this 40 years ago, and with the same speed:size ratios, we aren’t doing it today.
Laurie Dake OBE died in 1999 while visiting Perth, Western Australia, but his views on reservoir simulation are as fresh today as when he first wrote them over 20 years ago: “There is no such thing [as a Simulation Engineer], only reservoir engineers who happen to have simulation packages at their disposal for use, amongst other tools, as and when required.”
He also went on to say: “If mathematics is used carefully and correctly then we should have a great advantage over our predecessors in this subject but if it is abused by relying on mathematics to define physics, then reservoir engineering itself is in danger.”
With the emphasis on ever increasing model size, we reservoir engineers have thrown away the advantage that modern computing technology should have given us, and reservoir engineering is in danger of becoming irrelevant to financial and field development decision making.
Q. How can reservoir simulation be improved?
Increasing the number of cells in a simulation model does not improve the ability of the model to predict future performance. We should use Occam’s Razor which states that among competing hypotheses, the one with the fewest assumptions should be selected. This applies both to new and mature field development.
Occam’s Razor tells us that the model with the fewest parameters which matches the observations is the best predictor of future performance. This is easily seen when carrying out regression. With data scattered about a line, linear regression with only two parameters is a better predictor than a polynomial which fits all points exactly.
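The regression point can be demonstrated in a few lines of pure Python. This is a generic overfitting illustration (synthetic data, invented noise level), not anything from Dr Wadsley’s tools: a two-parameter least-squares line is compared against an eight-parameter polynomial that fits every noisy point exactly, and both are asked to extrapolate beyond the data, which is precisely what a history-matched model is asked to do.

```python
import random

random.seed(0)

# Noisy observations scattered about a known true line y = 2x + 1.
xs = [float(i) for i in range(8)]
ys = [2.0 * x + 1.0 + random.gauss(0.0, 1.0) for x in xs]

# Two-parameter model: ordinary least-squares straight line.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Eight-parameter model: the Lagrange polynomial through every point,
# which matches the observations exactly (zero residual).
def lagrange(x):
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Extrapolate beyond the data (x > 7): the simple model tracks the
# true line while the exact-fit polynomial swings wildly.
line_err = poly_err = 0.0
for x_new in (8.5, 9.5, 10.5):
    true_y = 2.0 * x_new + 1.0
    line_err = max(line_err, abs(slope * x_new + intercept - true_y))
    poly_err = max(poly_err, abs(lagrange(x_new) - true_y))
print(f"worst line error {line_err:.2f}, worst polynomial error {poly_err:.2f}")
```

The polynomial is the better history-match by construction; the line is the better forecaster. The parallel with an over-parameterised reservoir model is exact.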
One of the best approaches is BP’s Top Down Reservoir Modelling (TDRM) which is to start with the simplest possible model and add detail as required. As their 2004 paper says, “The approach overcomes the problems of the conventional ‘bottom-up’ process, which uses detailed models that are too slow and cumbersome to fully explore uncertainty and identify critical issues. Highly detailed models cannot overcome an underlying absence of information, and can have the negative effect of creating a false sense of understanding.”
The “effect of creating a false sense of understanding” has led directly to the current crisis in simulation and reservoir modelling.
An order-of-magnitude increase in model size does not bring an order-of-magnitude increase in confidence in the model results; on the contrary, it leads inevitably to a decrease in confidence and in the quality of the model outcomes.
This has been exacerbated by poor history-matching workflows, which fail to identify multiple distinct parameter sets that give the same match, and by poor pre-development workflows, which, using a bottom-up approach, fail to identify alternative reservoir development scenarios and models.
Moreover, we fail to recognise when simulation is no longer of any use, particularly at the end of field life. At that time, when production is in stripper mode, oil-water ratios are driven by subtle effects of relative permeability, structure and layering in the reservoir. Decline-curve or fractional-flow analysis is then a far more effective tool for forecasting future production than any ‘history-matched’ reservoir model. Exciting developments in this area use AI techniques to mine existing field production data to predict infill drilling and optimise field production, without the use of traditional reservoir simulation models.
Q. What opportunities does the oil price crash provide the industry?
It’s really hard on the people who have lost their jobs and the companies which have shut down. Looking forward, however, we have a real opportunity to reduce cost and drive competitiveness with alternative energy sources. New technologies such as direct reduction of methane to hydrogen suggest that hydrocarbons have a long secure future even as countries strive to reduce their carbon footprint.
I see us having to work smarter and quicker – that is, long study and history-match times are no longer acceptable, and new workflows, such as those we are developing in Stochastic, will mean faster, more timely decision making and quicker response to other changes in the industry, such as FLNG, gas management and portfolio optimisation, and environmental practices.
Q. What would you consider the biggest achievement throughout your career?
That’s a difficult one. Looking back over 40 years gives a somewhat different perspective to what I would have imagined when I just started. Then it was writing innovative software, getting my first reservoir simulator to commerciality, completing a really good field development plan. Now, when I look at the work the engineers do in our office, from training videos to country gas master plans, I’m really proud to have mentored them – they are the future of the industry. I get a real kick out of their making me obsolete.
Q. How does your job challenge you?
The biggest challenge is to remain relevant. It’s easy to be an old fart, but what is more difficult is to promote and present my views and insights to the modern young engineers in a way which makes them think outside the box of Reservoir Engineering 101 that they learnt at university. I got a real boost when one of our engineers, Farnoosh, said she had left her job with one of the major oil field services companies to work with me in Stochastic, possibly something to do with my teaching her reservoir simulation in the Masters programme at Curtin University. My challenge is to make all of our engineers and programmers feel the same way about working in the company and with me.
Q. Where has your career taken you, internationally?
Of course, Tasmania is the centre of the oil and gas industry – to within Galactic tolerance! Seriously, I’ve always had to travel, gaining my Honours degree from the Australian National University in Canberra and my PhD from the University of Warwick in Coventry, England. Since then I have worked in the Netherlands, Brunei, England, Scotland, Norway, New Zealand, Malaysia, Indonesia and several other countries.
Q. What impact do you see Cloud Computing having on the O&G industry?
Cloud computing is clearing the desktop, pushing the requirement for powerful computing into the background and allowing engineers to get on with their jobs without worrying about computer resources. High-speed computing is now a very cheap commodity.
Only our profession’s infatuation with very large simulation models is preventing a quantum leap in understanding reservoir development and uncertainty quantification.
Integrated models that took 10 hours to run on a VAX 11/780 in the 1980s are still taking 10 hours to run on a multi-core processor in 2015, even though cloud-based computer power has increased by more than a factor of 10000 and is 1/1000th of the cost. The challenge is to unleash this power to produce more effective modelling.
My approach is to apply TDRM using pebi-grids to accurately model reservoir structure whilst simultaneously reducing the number of active cells. To this end, we have developed a Polygonizer which creates pebi-grid models directly from Cartesian grids. Recently we took a standard, industry corner-point geometry model and reduced the model size from 1.7 million active cells to 110,000 cells using the Polygonizer, achieving runtimes of less than 25 seconds running ResAssure on a 4-core cloud processor. This speed advantage greatly assisted us in achieving an excellent history match for the 60 wells in the field.
Q. Your software programs are used worldwide by major Oil & Gas operators. What career advice do you give to others who want to follow in a similar career path to yours?
Perhaps the best advice comes from “Thomas the Tank Engine” who is a “really useful engine”. Any software must pass the “useful” test. And the best test for usefulness is to have a client who needs it and is willing to support development of it.
I was lucky to have Norsk Hydro in Norway assist development of ResNet (now OilAssure), Fletcher Challenge Petroleum in New Zealand assist development of the pebi-grid simulator Ressim (which has evolved into ResAssure), and Esso Australia assist development of Gasplan (now GasAssure). Without these companies’ support, I would have been coding based on whim and intuition (flying free, as it were), but not necessarily effectively or relevantly to today’s problems. Don’t fall into the trap of creating a solution and then looking for the problem it solves.
So, talk to your industry colleagues, get involved with the SPE or other industry organisations. Prototype, prototype again, and prototype once more, until you can demonstrate you have a really useful solution to a real problem.
Thank you, Dr Wadsley, for providing us with valuable insight into your profession and your thoughts on the industry. If you wish to contact Dr Wadsley, you may do so via his LinkedIn profile.
Learn more about Dr Wadsley’s Reservoir Analytics & Integrated Asset Modelling Platform
Fully integrated and scalable technology that turns complex problems into easy-to-configure solutions. Get to know the new, next-generation technology of the Oil & Gas upstream industry, coded from scratch by Dr Andrew Wadsley and his team at Stochastic Simulation.
For further reading, visit https://stochasticsimulation.com to learn more or follow some of the links below:
- Reservoir Simulation and History Matching Solutions Article.
- SPE Lecture with Dr Wadsley: “Using Integrated Asset Modelling to Improve Oil & Gas Planning” (Youtube Video)
- SPE Bangkok Lecture with Dr Wadsley: “Next Generation Oil & Gas Simulators” (Youtube Video)
- ResAssure Technical Datasheet (PDF)
- History Matching Brugge Field Case Study (PDF)
- GasAssure Technical Datasheet (PDF)
- View list of White Papers by Dr Wadsley here.