Search citation statements
One of the main concerns in the O&G business is generating reliable production profile forecasts. Such profiles are the cornerstone of optimal techno-economic management decisions. A workflow is proposed that combines different methodologies to integrate and reduce most of the subsurface uncertainties, using multiple history-matched models (explaining the past) to infer reasonably reliable production forecasts. Using experimental design theory, a sensitivity study is first performed to scan the whole range of static and dynamic uncertain parameters using a proxy model of the fluid-flow simulator. Only the most sensitive parameters with respect to an objective function (quantifying the mismatch between the simulation results and the observations) are retained for the subsequent steps. Assisted history matching tools are then used to obtain multiple history-matched models, an order of magnitude faster than traditional history-matching processes. The updated uncertain parameters (selected from the sensitivity study) may be picked anywhere in the direct-problem building workflow. Within the Bayesian framework, a posterior distribution of the most sensitive parameters is derived from the a priori distributions and a non-linear proxy model of the likelihood function. The latter is computed using experimental design, kriging, and dynamic training techniques. The multiple history-matched models, together with the a posteriori parameter distributions, are finally used in a joint modeling approach to capture the main uncertainties and to obtain typical (P10-P90) probabilistic production profiles. This workflow has been applied to a real gas-storage case subject to significant seasonal pressure variations. Probabilistic operational pressure profiles for a given period can then be compared to the actual dynamic behaviour of the gas storage to assess the added value of the proposed workflow.

Introduction. Getting probabilistic production forecasts of a reservoir through a risk analysis is closely linked to uncertainty quantification 1. Uncertainty quantification should end with a posteriori distributions of the uncertain parameters, reflecting all the knowledge we have about the reservoir and explaining the observation data as well as possible. The uncertainties may result both from the quality of the observation data and from the numerical modeling steps. On top of that, the fluid-flow equations we are dealing with are non-linear, which can lead to complex hydrodynamic behaviour. CPU constraints make brute-force Monte Carlo sampling of the a posteriori distributions from the a priori ones inefficient. Most current approaches try to scan the uncertain-parameter space more efficiently (in a given timeframe) using reliable proxy models 2 of the fluid-flow simulator and/or to take advantage of new computational power 3,4,5,6. Due to the numerous uncertain parameters involved in the history-matching process, a screening of the parameter space is first performed (i.e. a sensitivity study) to retain only the most sensitive parameters with respect to the history-matching criteria, possibly reducing their variation intervals. This is a pre-processing step of the history-matching job.
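The chain described in this abstract (experimental design, kriging proxy of the objective function, Bayesian posterior sampling, P10-P90 summary) can be illustrated with a minimal sketch. This is not the authors' implementation: the parameter count, ranges, the toy objective function, and the forecast response below are assumptions made purely for the example.

```python
# Minimal sketch: kriging proxy of the history-matching objective function,
# Monte Carlo weighting of prior samples, and percentile forecast summary.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective_function(theta):
    """Stand-in for the simulator mismatch (sum of squared residuals).
    In practice each evaluation is a full fluid-flow simulation run."""
    return np.sum((theta - 0.3) ** 2, axis=-1) * 50.0

# 1. Experimental design: space-filling samples of the two most sensitive parameters.
n_design = 40
X_design = rng.uniform(0.0, 1.0, size=(n_design, 2))
y_design = objective_function(X_design)

# 2. Kriging (Gaussian-process) proxy of the objective function.
proxy = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
proxy.fit(X_design, y_design)

# 3. Monte Carlo sampling of the prior, weighted by the proxy likelihood
#    L(theta) ~ exp(-OF(theta)/2), to approximate the posterior.
X_prior = rng.uniform(0.0, 1.0, size=(20000, 2))
of_proxy = proxy.predict(X_prior)
weights = np.exp(-0.5 * of_proxy)
weights /= weights.sum()

# 4. Posterior resampling and percentile summary of a toy forecast quantity
#    (a linear response standing in for a cumulative production profile).
idx = rng.choice(len(X_prior), size=5000, p=weights)
posterior = X_prior[idx]
forecast = 100.0 + 40.0 * posterior[:, 0] - 15.0 * posterior[:, 1]
p10, p50, p90 = np.percentile(forecast, [10, 50, 90])
print(f"10th={p10:.1f}  50th={p50:.1f}  90th={p90:.1f}")
```

In the workflow described above, the kriging proxy would be retrained ("dynamic training") as new simulator runs become available, and the forecast would come from rerunning the history-matched models rather than from an analytical response.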
History matching is widely considered the most time- and resource-consuming phase of reservoir simulation modeling. Even with the advent of modern, computer-assisted history-matching methods, the dynamic calibration of large-scale simulation models represents a considerable computational undertaking. The challenges become even more pronounced with the incorporation of subsurface and production uncertainty. This paper outlines a step forward in accelerating reservoir simulation studies by applying a split/merge approach constrained by no-flow-boundary drainage regions. The method transforms the history-matching process into an accelerated, progressive sequence of dynamic model updates in time and space. Each segment is defined as a distinct drainage region whose boundaries are mapped from no-flow conditions. The segments are dynamically calibrated and history matched simultaneously, in parallel. Lastly, the segments are merged back to reconstruct the original model and run the prediction phase. The details of the workflow are described, as well as its implementation on a synthetic model. A comparison between the conventional approach and the new approach is discussed. Recommendations and a way forward are shared to capitalize on the accelerated method in future reservoir studies.
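The split/calibrate-in-parallel/merge idea can be sketched as follows. The segment definitions, the per-segment "simulator", and the one-parameter calibration below are hypothetical placeholders, not the paper's workflow; the point is only the structure of splitting by drainage region, matching segments concurrently, and merging the updates back.

```python
# Minimal sketch of split/merge history matching over no-flow-bounded segments.
from concurrent.futures import ProcessPoolExecutor

# Full model split into drainage regions bounded by no-flow surfaces
# (here just named segments with an initial permeability multiplier).
SEGMENTS = {
    "region_A": {"perm_mult": 1.0, "observed_rate": 120.0},
    "region_B": {"perm_mult": 1.0, "observed_rate": 85.0},
    "region_C": {"perm_mult": 1.0, "observed_rate": 60.0},
}

def simulate_rate(perm_mult, base_rate=100.0):
    """Toy stand-in for a segment-scale flow simulation."""
    return base_rate * perm_mult

def calibrate_segment(item):
    """History-match one segment: crude 1-D search on the multiplier."""
    name, seg = item
    best_mult, best_err = seg["perm_mult"], float("inf")
    for mult in [0.5 + 0.05 * i for i in range(31)]:  # scan 0.5 .. 2.0
        err = abs(simulate_rate(mult) - seg["observed_rate"])
        if err < best_err:
            best_mult, best_err = mult, err
    return name, {"perm_mult": best_mult, "mismatch": best_err}

def main():
    # Split: each drainage region is matched simultaneously in its own process.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(calibrate_segment, SEGMENTS.items()))
    # Merge: updated properties are written back into the full-field model
    # before the prediction phase is run on the reconstructed model.
    merged_model = {name: {**SEGMENTS[name], **upd} for name, upd in results.items()}
    print(merged_model)

if __name__ == "__main__":
    main()
```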
Production from the North Sea reservoirs often results in a pressure decrease below the bubble point. Gas is liberated from the oil, in the form of bubbles or as a continuous flowing phase. In such cases, the two phases, gas and oil, flow in the reservoir simultaneously, and the flow is governed by the values of the relative permeabilities. Traditional core flooding in low-permeability rocks is challenging; therefore, we use a novel experimental approach to determine the oil relative permeabilities below the critical gas saturation. A mathematical model has been created to reconstruct both the gas and the oil relative permeabilities over the whole saturation range. Laboratory observations have shown that in low-permeability rocks the relative permeabilities may decrease strongly, even when the amount of liberated gas is small. The goal of this work is to verify, on a specific example, whether the designed relative permeability model can explain the observed production behavior of a low-permeability chalk reservoir in the North Sea. We perform a sensitivity study on the relative permeability parameters and analyze the corresponding differences in well productivities. A reasonably good match of <10% can be obtained to the historical well production data. A few cases where the match was not satisfactory (14% to 65%) are also analyzed, and the difference is attributed to the imprecise fluid model. The developed experimental and modeling methodology may be applied to other reservoirs developed by the solution-gas-drive mechanism.
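For orientation only, the next sketch shows the kind of ingredients such a study combines: Corey-type relative permeability curves with an extra reduction of the oil phase around the critical gas saturation, and a simple relative mismatch metric of the sort behind the "<10%" match statement. All exponents, endpoints, and the reduction factor are assumed values; the paper's actual model is different and more detailed.

```python
# Minimal sketch: assumed relative permeability curves and a mismatch metric.
import numpy as np

def kr_gas(sg, sgc=0.05, n_g=2.0, krg_max=0.6):
    """Gas relative permeability; gas is immobile below the critical saturation sgc."""
    s = np.clip((sg - sgc) / (1.0 - sgc), 0.0, 1.0)
    return krg_max * s ** n_g

def kr_oil(sg, sgc=0.05, n_o=3.0, damage=0.4):
    """Oil relative permeability; 'damage' mimics the strong reduction observed
    in low-permeability rocks even when little gas has been liberated."""
    base = (1.0 - sg) ** n_o
    reduction = np.where(sg < sgc, 1.0 - damage * sg / sgc, 1.0 - damage)
    return base * reduction

def relative_mismatch(simulated, observed):
    """Percentage mismatch between simulated and historical production."""
    return 100.0 * np.abs(simulated - observed).sum() / np.abs(observed).sum()

sg = np.linspace(0.0, 0.3, 7)
print(np.round(kr_oil(sg), 3), np.round(kr_gas(sg), 3))

obs = np.array([100.0, 95.0, 88.0, 80.0])   # historical well rates (toy data)
sim = np.array([98.0, 93.0, 90.0, 78.0])    # simulated rates (toy data)
print(f"mismatch = {relative_mismatch(sim, obs):.1f}%")  # a good match if < 10%
```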