This paper presents a case study using optimization technology to improve the reliability of reservoir simulation models. Global optimization techniques have been applied to assist the history matching (HM) performed. Evolutionary Algorithms and deterministic optimization schemes are integrated into a workflow controlling a large number of parallel reservoir simulations. Results are analyzed and compared to traditional HM to identify the potential added value and increased efficiency. Focus is given to the reservoir engineer, who remains in full control of the optimization process and interacts frequently as he/she acquires more information and an improved understanding of subsurface uncertainties. The case study demonstrates the benefit and added value of utilizing a computer-assisted HM workflow, and pitfalls of using a traditional workflow are illustrated. The paper focuses on a gas condensate reservoir where the most important well had proven difficult to match using a traditional manual HM approach. The sensitivities of a large number of input parameters were investigated, and an optimization scheme was set up to generate alternative HM realizations of the simulation model. Alternative models were generated using a wide variety of input parameter combinations. The entire study was conducted over a two-week period, during which approximately 7000 full-field reservoir simulations were run. A large number of simulation models were matched within the uncertainty of the observed data, and these acceptable matches gave a wide range of recoverable reserves from the field. The first match that was found proved to be outside the initial uncertainty range, clearly verifying the need for more than one HM model to cover the uncertainty found in most producing reservoirs. This illustrates the non-uniqueness of simulation models and the danger of relying on one deterministic model.
Introduction
During the last few years, reservoir engineers have fallen dramatically behind the rest of the subsurface community. While significant resources have been spent on developing efficient tools and workflows for generating alternative geological realizations, the production forecasting workflow has not improved accordingly. As the work performed by the reservoir engineer is most often limited by the available time frame, the traditional manual workflow is simply too slow to utilize the information gathered by the other disciplines. Relief is fortunately on the way, as a new era is rising within the discipline of reservoir engineering. The improvements are driven by the new computational power available for simulation studies. Hardware costs have been dramatically reduced over the last few years and will continue to fall. Linux clusters provide the opportunity to run parallel and distributed reservoir simulations at affordable prices. A new generation of reservoir simulators with improved capacity for parallel processing and improved solvers is also being brought to the market. Furthermore, recent advances in multi-core processors will change the way engineers work with computers. Multi-core processing is the future of computing [1-3] and will enable engineers to run a large number of simulations on their desktop PC or laptop. The evolution of multi-core processing poses technical and commercial challenges for simulation software vendors regarding licensing issues and pricing policy. These technological advances provide new opportunities for the use of computer-assisted history matching. While the automatic HM tools developed during the 1990s and earlier employed simple gradient-based optimization methods and ran reservoir simulations sequentially, more robust optimization methods are now available: they allow the engineer to launch simulations in parallel and to use global optimization techniques [4-10].
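The evolutionary search over simulation input parameters described above can be sketched in a few lines. This is a minimal illustration only: the `misfit` function below is a toy stand-in for a full reservoir simulation run, and the parameter names (`kx` for a permeability multiplier, `skin` for a well skin factor) and their bounds are hypothetical, not taken from the paper.

```python
import random

def misfit(params):
    """Toy stand-in for one reservoir simulation run: returns the
    history-match error. In a real workflow this call would launch a
    full simulation, many of which run in parallel on a cluster."""
    kx, skin = params
    return (kx - 120.0) ** 2 + (skin - 2.5) ** 2

def evolve(pop_size=20, generations=30,
           bounds=((10.0, 500.0), (0.0, 10.0))):
    """Simple evolutionary search: keep the best quarter of each
    generation and refill the population with mutated children."""
    rng = random.Random(42)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=misfit)
        parents = scored[: pop_size // 4]       # elitist selection
        pop = list(parents)
        while len(pop) < pop_size:              # Gaussian mutation
            parent = rng.choice(parents)
            child = [min(max(p + rng.gauss(0, 0.05 * (hi - lo)), lo), hi)
                     for p, (lo, hi) in zip(parent, bounds)]
            pop.append(child)
    return min(pop, key=misfit)

best = evolve()
```

In practice each `misfit` evaluation is a full simulation, so the per-generation population maps naturally onto a batch of parallel cluster jobs, which is the efficiency gain the paper emphasizes.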
Abstract
Large simulation models with excessive simulation time are traditionally a challenge for the reservoir engineer. This paper introduces a new concept of Event Targeting Model Calibration used for History Matching and Uncertainty Quantification in reservoir simulation. It is shown that the history matching process of large reservoir simulation models can be significantly improved by coupling different experimental design, optimization and analysis techniques. Assisted history matching techniques using stochastic and direct search methods have already proven to outperform manual workflows on small and moderate size simulation models. The application of the same techniques and methodology to the largest simulation models (defined by long simulation times) has proven challenging. A more sophisticated use of the assisted history matching toolbox is necessary to utilize CPUs for multiple simulation models as efficiently as possible, even if distributed computing capabilities are available. The paper describes a workflow employing assisted history matching techniques to handle 'monster' simulation models. An Event Targeting Model Calibration work process is introduced which focuses on key historical events and divides the production history into main time periods. The optimization algorithms usually used in assisted history matching studies are replaced by experimental design methods to investigate the different time periods. Analysis techniques, such as a newly implemented cluster analysis, are used to identify alternative history matched models in a multi-objective optimization formulation. Optimization algorithms are finally used for fine-tuning purposes. In this framework, a remarkable improvement of both the history matching process and uncertainty quantification is possible. This is a significant breakthrough, improving the capability of understanding reservoir uncertainties for large field developments.
This paper summarizes experiences from several different complex history matching studies and outlines guidelines to apply state-of-the-art optimization techniques in combination with experimental design methods to the problem of History Matching and Uncertainty Quantification of large simulation cases.
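The cluster analysis mentioned above groups simulation runs by how their mismatch is distributed across the time periods, so that alternative history-matched models can be identified. A minimal sketch of that idea, assuming each run is summarized by a hypothetical two-component error vector (early-period error, late-period error) with made-up numbers:

```python
import random

def kmeans(points, k=2, iters=20, seed=1):
    """Plain k-means on per-period mismatch vectors, one vector per
    simulation run. Runs in the same cluster match (and mismatch) the
    same parts of the production history."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each run to its nearest cluster center
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute each center as the mean of its assigned runs
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Illustrative data: each row is (early-period error, late-period error).
runs = [(0.2, 3.1), (0.3, 2.8), (0.25, 3.0),   # good early, poor late match
        (2.9, 0.4), (3.1, 0.3), (2.8, 0.5)]    # poor early, good late match
centers, groups = kmeans(runs, k=2)
```

The two resulting clusters correspond to two qualitatively different ways of fitting the history, which is exactly the multi-objective trade-off the workflow tries to surface.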
An increasing number of field development projects include rigorous uncertainty quantification workflows based on parameterized subsurface uncertainties. Model calibration workflows for reservoir simulation models including historical production data, also called history matching, deliver non-unique solutions and remain technically challenging. The objective of this work is to present a manageable workflow design with well-defined project workflow tasks for reproducible result presentation. Data analysis techniques are applied to explore the information content of multiple-realization workflow designs for decision support. Experimental design, sampling and Markov Chain Monte Carlo (MCMC) techniques are applied for case generation. Data analytics is applied to identify patterns in data sets supporting the evaluation of the history matching process. Visualization techniques are used to present dependencies between contributions to the history matching error metric. Conflicting history matching responses are identified and add value to the interpretation of history matching results. Probability maps are calculated on the basis of multiple realizations sampled from a posterior distribution to investigate potentially under-developed reservoir regions. Technologies are applied to a real gas field in the Southern North Sea. For the purpose of the benchmark, a structured workflow design for history matching and estimation of prediction uncertainty is presented. Sensitivity evaluations are used to identify key uncertain input parameters and perform parameter reduction. MCMC is applied for optimization and uncertainty quantification. Statistical stability of key performance parameters is verified by repeating relevant phases of the workflow several times. In conclusion, practical consequences and best practices, as well as the use of data analytics in history matching workflows, are discussed.
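The MCMC sampling of a posterior distribution described above can be illustrated with a random-walk Metropolis sketch. This is a toy example under stated assumptions: `mismatch` stands in for the history-match error of one simulation run, the single parameter `perm` and all numbers are hypothetical, and the posterior is taken to be proportional to exp(-mismatch).

```python
import math
import random

def mismatch(perm):
    """Toy stand-in for the history-match error of one run; a real
    workflow would evaluate the reservoir simulator here."""
    return (perm - 150.0) ** 2 / 400.0

def metropolis(n=5000, step=20.0, seed=7):
    """Random-walk Metropolis sampling of p(perm | data)
    proportional to exp(-mismatch(perm))."""
    rng = random.Random(seed)
    x, chain = 100.0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        # accept if the proposal reduces the misfit, or with
        # probability exp(-increase) if it increases it
        if math.log(rng.random()) < mismatch(x) - mismatch(prop):
            x = prop
        chain.append(x)
    return chain

chain = metropolis()
posterior = chain[1000:]                 # discard burn-in
mean = sum(posterior) / len(posterior)
```

The retained samples approximate the posterior over the uncertain parameter; running predictions through many such samples is what yields the probability maps and prediction-uncertainty ranges the abstract refers to.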