With the dearth of easy oil in the industry, consistency in quantifying uncertainties and assessing their impact on investment decisions has become crucial to management decisions. As a result, the stock of both experimental design and response surface techniques in the E&P industry has risen significantly as an alternative to more traditional uncertainty analysis. While there are papers describing experimental design workflows and the different methods of generating response surface models for reservoir simulation studies, there is also a growing need to share practical examples of the lessons learned in constructing experimental designs and using response surface models to interrogate the experimental design outcomes. After applying these concepts extensively for over 18 months to identify the major subsurface uncertainties, explain observed production performance, and prescribe additional development options for fifteen reservoirs across four fields at different stages of maturation, it has been possible to capture many useful lessons. These lessons will both strengthen the benefits and ease the pains of applying these concepts in reservoir simulation studies.

Introduction

The ability of a simulation model to satisfactorily explain past reservoir performance underpins its reliability for predicting future reservoir performance. Unfortunately, simulation models are not unique, and this undermines the credibility of forecast results. To mitigate the impact on business decisions, the reservoir engineer conducts several equally probable simulations to capture the range of uncertainties. Left unchecked, the number of simulation runs can easily become unmanageable. Experimental design (ED) and the associated response surface methodologies (RSM) offer a cost-effective and efficient way to assess the impact of uncertainties on business decisions.
The method also helps to identify the parameters that have the most influence on the business decision. Since the first introduction of experimental design to the oil industry in the early 90's (Damsleth, Egeland, Larsen), reservoir engineers have developed and successfully applied several experimental design workflows to various reservoir engineering studies (Friedmann, White, Amudo, Graf, Salhi, …). A typical workflow features the following steps:

- Uncertainty framing
- Screening parameters
- Constraining uncertainty ranges
- Risk analysis

Following a brief introduction of the reservoirs, the outline of this paper mirrors the typical steps of an experimental design workflow. The paper presents the lessons and experiences distilled from the application of ED and RSM concepts to 15 detailed reservoir studies at different points in their project maturity spectrum. Though this paper does not purport to have all the answers, it attempts to address the painful challenges and highlight the benefits of using the technology.
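The screening and response-surface steps above can be sketched in a few lines of code. The following is a minimal illustration, not the workflow used in the studies: it runs a two-level full factorial design over three hypothetical coded parameters, fits a main-effects response surface by least squares, and ranks the parameters by effect size. The parameter names and the stand-in "simulator" are invented for illustration.

```python
# Minimal sketch of a two-level experimental design and a linear
# response surface fit. Parameter names and the toy simulator are
# hypothetical, not taken from the studies described above.
import itertools
import numpy as np

# Three uncertain parameters, each coded to -1 (low) / +1 (high).
factors = ["aquifer_strength", "porosity_mult", "kv_kh_ratio"]
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

def toy_simulator(x):
    """Stand-in for a reservoir simulation run returning cumulative oil."""
    a, p, k = x
    return 100.0 + 8.0 * a + 15.0 * p + 3.0 * k + 2.0 * a * p

responses = np.array([toy_simulator(run) for run in design])

# Fit main effects: response ~ b0 + sum(b_i * x_i).
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, responses, rcond=None)

# The largest |coefficient| flags the "heavy hitter".
for name, b in zip(factors, coeffs[1:]):
    print(f"{name}: effect = {b:+.2f}")
```

Because the full factorial design is orthogonal, the fitted main effects are unaffected by the interaction term, which is why screening designs of this kind isolate the heavy hitters cleanly.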
Petroleum Experts' Integrated Production Model (IPM) suite of software is widely used in the E&P industry, especially for project evaluations that require integration of surface and subsurface models. There is evidence in the literature of diverse applications in field development planning, integrated forecasting, surveillance, and production system optimization. Perhaps less reported are the lessons learned and best practices in using the IPM software. This paper focuses on these issues using Chevron's IPM model for some of its largest gas fields.

The Non-Operated Joint Venture (NOJV) Subsurface Team began developing an IPM model for one of its biggest gas assets in 2005. With explicit modeling of critical components such as compressors, dozens of wells and reservoir tanks, platforms, fluid characterisation, gas-water contact movement, pipelines, sub-sea manifolds, and separators, this is arguably the largest and most complex IPM model in Chevron. The model continues to play a critical role in Chevron's effective capital stewardship of the gas asset. The need to maintain the credibility of this model cannot be over-emphasized, and the model has undergone several phases of enhancement to ensure that it continues to meet business objectives.

This paper describes some of the best practices and lessons learned in constructing and maintaining a complex IPM model. It is intended as a resource for IPM practitioners. Examples cover all aspects of the IPM, from the non-technical (e.g. framing the problem, case definition, and naming conventions) to the technical (e.g. model construction, model maintenance, software limitations, constraint violations, production optimisation, and quality assurance checks).

Introduction

Production forecasting involves attaching a timescale to production recovery and is one of the most vital roles of reservoir engineering. It underpins the cashflow of any project and can make the difference between a project being sanctioned or abandoned.
The complexity of the role is underscored by the requirement to integrate multiple and diverse disciplines, including subsurface characterisation, surface network configuration, production philosophy, economic limits, business decisions, and operational constraints. Unlike production forecasting for oil fields, gas forecasting is further complicated by long-term contracts and the need to meet contractual obligations. This requirement means that gas companies must correctly predict the execution of future projects to ensure that they have enough gas to satisfy their contractual obligations. Usually, in gas forecasting, multiple fields with diverse fluid properties are produced simultaneously; this introduces the further complication of managing gas quality while also maximizing the value of by-products like condensate and natural gas liquids. These complexities indicate that an integrated model is required to accurately forecast production for a gas field. There are many such products available, including company proprietary software for internal us...
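At the reservoir end of such an integrated model, each tank is typically represented by material balance. A minimal sketch of the dry-gas case, where p/z declines linearly with cumulative production, is shown below; all numbers are illustrative, not field data, and real IPM tanks handle far more physics (water influx, condensate dropout, compressibility correlations).

```python
# Minimal sketch of the tank material balance that underlies gas
# forecasting in integrated models: for a volumetric dry-gas tank,
# p/z declines linearly with cumulative production Gp.
# All numbers are illustrative, not field data.
def p_over_z(gp, p_i=5000.0, z_i=0.9, giip=1000.0):
    """p/z (psia) after producing gp Bscf from a giip-Bscf dry-gas tank."""
    return (p_i / z_i) * (1.0 - gp / giip)

# Forecast yearly p/z at a flat 50 Bscf/yr offtake.
for year in range(1, 6):
    gp = 50.0 * year
    print(f"year {year}: Gp = {gp:6.1f} Bscf, p/z = {p_over_z(gp):7.1f} psia")
```

In an integrated model, the tank pressure from this relation is passed to well inflow and surface network calculations each timestep, which is where the contractual and gas-quality constraints discussed above are enforced.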
With the increasing acceptance of stochastic workflows in mainstream reservoir engineering studies, many frameworks have been developed to assist in the history match of reservoir models. This paper describes the application of experimental design and response surface methods, not only in conditioning complex reservoir models to the historical production data but also in refining the reservoir models to improve the overall history match results. The reservoirs are in the Niger Delta and consist of faulted layers from both the Benin and Agbada formations. The reservoir models envelop all major reservoir uncertainties, ranging from static parameters such as structure and porosity to dynamic parameters such as aquifer strength, relative permeability, and even production records. The experimental design combined all the subsurface uncertainties in different realizations and ensembles to construct response surface models capturing the multiple responses of the simulated historical performance. These response surface models serve three main purposes: identification of the "heavy hitters," improving the reservoir model, and facilitating the stochastic history match. The history-matched ensemble successfully explained the reservoir and drainage point production performance; identified uncertainties that have the most significant impact on the historical performance and development; established the most likely original water contact for one of the reservoir compartments; explained the connectivity between the different fault blocks; and formed the basis for risk mitigation analysis of further development in the reservoir.

Introduction

The usefulness of a model in supporting future development activities in a reservoir depends largely on how well the model is able to explain past reservoir production performance. This process, known as history match, involves conditioning a reservoir model to the historical production data.
However, history match is not only a difficult problem; it is a non-unique and generally time-consuming inverse problem to solve. This non-uniqueness results in several combinations of model parameters that can adequately explain past reservoir performance. Though these models may satisfactorily explain past performance, they often produce divergent outcomes when used to predict the future performance of the reservoir. This range of outcomes relates directly to the uncertainty associated with any development option and forms a critical input to business decisions. It is, therefore, desirable to have a method that both captures the widest possible combination of model parameters explaining the historical production data and is quick to update. Considering the time-intensive nature of history match and its other limitations, the traditional deterministic approach that relies on trial and error may be inappropriate for meeting these objectives. The process of stochastic history matching, on the other hand, differs from conventional history matching and is better suited to handling uncertainties consistently. It involves creating a response surface model by fitting the outcomes of an experimental design to an equation containing the most influential parameters.
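The stochastic step described above can be sketched as a simple accept/reject loop over a proxy: sample the uncertain parameters, evaluate the fitted response surface instead of the full simulator, and keep realizations whose predicted response is close to the observed data. The quadratic proxy, its coefficients, and the parameter names below are hypothetical, chosen only to make the mechanics concrete.

```python
# Minimal sketch of proxy-based stochastic history matching: evaluate a
# cheap response surface in place of the simulator and retain only the
# realizations that match the observation within tolerance.
# The proxy and its coefficients are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def proxy(aquifer, owc_shift):
    """Response surface for cumulative water production (illustrative)."""
    return 40.0 + 12.0 * aquifer - 6.0 * owc_shift + 4.0 * aquifer**2

observed = 50.0   # measured cumulative water production
tolerance = 5.0   # acceptable mismatch

# Coded parameters sampled uniformly over their uncertainty ranges.
samples = rng.uniform(-1.0, 1.0, size=(10_000, 2))
predicted = proxy(samples[:, 0], samples[:, 1])
matched = samples[np.abs(predicted - observed) < tolerance]

print(f"accepted {len(matched)} of {len(samples)} realizations")
```

Because each proxy evaluation is effectively free, millions of parameter combinations can be screened this way, which is what makes it practical to capture the widest possible matched ensemble rather than a single deterministic match.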
The production profile of an unconventional resource play typically shows a very steep decline after attaining the peak production rate. Consequently, operators focus increasingly on drilling wells faster and getting them on production quicker to improve early cash flows. This can sometimes come at the expense of gathering potentially useful data that may help improve reservoir characterization. This practice raises several vital questions that traditional methods based on either deterministic or one-variable-at-a-time (OVAT) approaches struggle to answer coherently. Examples of such questions include:

- How much impact does drilling and completion speed have on the overall project economic measures?
- What is the relative significance of all the key factors affecting the project economic measures?
- Is it worth taking the time to acquire data that can ultimately reduce the uncertainty in the reservoir characterization and production performance?
- What are the approximate models for the response variables?

A new workflow based on experimental design concepts has been developed to answer the above questions and tested on an unconventional shale oil resource. In this case study, we used a D-Optimal design table and evaluated the impact of a number of diverse factors, ranging from speed of drilling, put-on-production (POP) time, production ramp-up, expenditure, and product price to the production type curve, on the project economic measures. The results show, for example, that while a reduction in drilling and completion times may affect early production metrics, other factors such as the production type curve have much more impact on the project's net present value (NPV) and discounted profitability index (DPI).
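The kind of economic response evaluated in such a design can be sketched as follows: NPV and DPI for a single well under a hyperbolic decline type curve, with the put-on-production delay as one of the factors. All inputs (rates, prices, capex, decline parameters) are hypothetical, and this is a simplified monthly cash-flow sketch, not the economic model used in the study.

```python
# Sketch of an economic response for one well: NPV and discounted
# profitability index (DPI) under a hyperbolic decline type curve.
# All inputs are hypothetical and the cash-flow model is simplified.
import numpy as np

def hyperbolic_rate(t_years, qi=500.0, di=1.2, b=1.1):
    """Daily oil rate (bbl/d) from a hyperbolic decline type curve."""
    return qi / (1.0 + b * di * t_years) ** (1.0 / b)

def npv_and_dpi(pop_delay_years=0.25, price=70.0, capex=8e6,
                opex_per_bbl=12.0, discount=0.10, life_years=10):
    months = np.arange(life_years * 12)
    t = months / 12.0
    rate = hyperbolic_rate(t)           # bbl/d at the start of each month
    volume = rate * 30.4                # approximate bbl per month
    cash = volume * (price - opex_per_bbl)
    # Production, and hence revenue, is shifted by the POP delay.
    disc = (1.0 + discount) ** -(t + pop_delay_years)
    pv = float(np.sum(cash * disc))
    return pv - capex, pv / capex       # NPV, DPI = PV / capex

base_npv, base_dpi = npv_and_dpi()
fast_npv, _ = npv_and_dpi(pop_delay_years=0.0)
print(f"NPV gain from eliminating the POP delay: ${fast_npv - base_npv:,.0f}")
```

Feeding a function like this with the factor levels of a D-Optimal design table yields the response data from which the relative significance of POP time versus type-curve uncertainty can be judged.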