Over the last twenty or more years of reservoir performance prediction through simulation there have been only two fundamental changes. The first was the evolutionary increase in computing speed that has allowed larger, more detailed reservoir models to be built. The second was the revolutionary change in approach that involved the entire subsurface community in building integrated reservoir descriptions. The next big change may in time prove to be BP's Top-Down Reservoir Modelling (TDRM), a new pragmatic approach that fully incorporates reservoir uncertainty in model construction and performance prediction. TDRM is proprietary technology developed in BP through extensive R&D, and consists of a philosophy and tools that enable a faster and more robust exploration of uncertainty than has hitherto been possible. The philosophy is to start investigations with the simplest possible model and simulator appropriate to the business decision; detail is added later as required. The approach overcomes the problems of the conventional "bottom-up" process, which uses detailed models that are too slow and cumbersome to fully explore uncertainty and identify critical issues. Highly detailed models cannot overcome an underlying absence of information, and can have the negative effect of creating a false sense of understanding. The TDRM tools have been designed to minimise manual iteration by creating a semi-automated, flexible workflow for case management, assisted history matching, depletion-planning optimisation and post-analysis. TDRM has been successfully applied to eighteen oil and gas reservoirs, ranging from development appraisal stage to mature fields, and has resulted in increases of up to 20% in estimated net present value for the projects.

Background

The business imperatives in developing oil and gas reservoirs are faster pace and less risk from subsurface uncertainties.
Quantification of the uncertainties is difficult and time consuming because of (a) the intrinsic subsurface complexity, which requires integration of data from core to seismic scales (centimetres to tens of metres), (b) the sparseness of information, which requires estimation of unknown data for the construction of possible geological and simulation models, and (c) the need to consider a large number of development scenarios.

Processes used to estimate uncertainties vary, but the general trend is to start by building a large (multi-million-cell) geological model. Often the type of model is independent of the business decision, timeframe and amount of data available. Because of the complex workflow and effort involved, the focus is on building only one, "most likely", detailed model, even though evidence from the data indicates that there are many possible models. The next step is to build a simulation model, which typically involves upscaling the geological model. If production data exist, this simulation model is history matched manually; iterative rebuilding of the underpinning geological model is generally avoided. Exploration of the uncertainty in performance prediction using the simulation model is often limited to one-at-a-time sensitivities around a base case. These sensitivities are only a small sample of the factorially combined possibilities. The effort to reach this stage is significant and can take many months for a major reservoir decision. Overall, the focus of activity has been building ever more complex (and hence apparently realistic) models and predicting performance from only a single realisation. Breaking away from this general approach and focusing on the real breadth of uncertainty in performance prediction is a conceptual leap that requires new technology and understanding.

Technology Improvements

Technology improvements are providing better information about current and future reservoir performance and offer the opportunity to quantify the risk from subsurface uncertainties.
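The gap between one-at-a-time sensitivities and the factorially combined possibilities can be made concrete with a small sketch. The parameter names and level values below are purely illustrative, not taken from any particular field study; the point is only the case count.

```python
from itertools import product

# Hypothetical uncertain parameters, each with low/mid/high levels.
# Names and values are illustrative only.
params = {
    "perm_multiplier": [0.5, 1.0, 2.0],
    "aquifer_strength": [0.1, 1.0, 10.0],
    "fault_seal": [0.0, 0.5, 1.0],
    "kv_kh_ratio": [0.001, 0.01, 0.1],
}

# One-at-a-time (OAT): vary each parameter in turn from the mid "base case".
base = {k: v[1] for k, v in params.items()}
oat_cases = [base] + [
    {**base, name: level}
    for name, levels in params.items()
    for level in (levels[0], levels[2])
]

# Full factorial: every combination of levels.
factorial_cases = [dict(zip(params, combo)) for combo in product(*params.values())]

print(len(oat_cases), "OAT cases vs", len(factorial_cases), "factorial cases")
```

With only four three-level parameters, OAT samples 9 cases against 81 combinations; real studies have many more parameters, which is why semi-automated case management is needed to explore the space.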
Some of these advances are highlighted below.
Summary. A theory of compositional viscous fingering with no adjustable parameters that reduces to the Todd and Longstaff model for miscible floods is presented. The theory gives excellent predictive agreement with simulation results for a wide range of recovery processes. The high-resolution compositional simulations used to validate the theory are the first to resolve viscous fingering adequately in flow other than miscible flow.
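For readers unfamiliar with the Todd and Longstaff model referenced above, a minimal sketch of its standard form follows. The quarter-power mixing rule and the mixing parameter omega (often taken as 2/3) are the textbook formulation; this is not the compositional generalisation the abstract describes.

```python
def todd_longstaff(mu_oil, mu_solvent, c_solvent, omega=2/3):
    """Todd-Longstaff effective viscosities for a partially miscible flood.

    mu_oil, mu_solvent : phase viscosities (consistent units, e.g. cP)
    c_solvent          : local solvent concentration (fraction)
    omega              : mixing parameter; 1 = full mixing, 0 = no mixing
    """
    # Quarter-power mixing rule for the fully mixed viscosity.
    mu_mix = (c_solvent / mu_solvent**0.25
              + (1 - c_solvent) / mu_oil**0.25) ** -4
    # Effective viscosities interpolate between unmixed and fully mixed.
    mu_oil_eff = mu_oil ** (1 - omega) * mu_mix ** omega
    mu_solvent_eff = mu_solvent ** (1 - omega) * mu_mix ** omega
    return mu_oil_eff, mu_solvent_eff
```

At omega = 0 the phases keep their pure viscosities (no mixing); at omega = 1 both phases flow with the fully mixed viscosity.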
The increased use of fine scale three dimensional geologic models has offered both new opportunities and new challenges for the reservoir engineer. These models offer new opportunities to assess the impact of reservoir heterogeneity on performance prediction. Through the use of multiple models, we are in a better position to evaluate the uncertainty of these predictions. Performance prediction relies either on direct simulation (e.g., streamline techniques) or on simulation of upscaled models. Unfortunately, the fundamental development work for each has been performed in relatively simple flow geometries, whilst by their nature, geologic models tend to include more irregular geometries: layers that are removed by scour or by internal pinch-out, isopach thicknesses that vary significantly, and well trajectories that intersect many fine cells at arbitrary orientations to the local stratigraphy. We report on our experience, and the new theoretical advances required to resolve these difficulties. The most important advances have had to do with upscaling and transmissibility, although we also re-examine the management of pinched-out cells, and the calculation of well PI. A new upscaling formulation is introduced which emphasizes three dimensional permeability; it is especially well suited to upscaling from irregularly shaped regions. Resampling from the geologic grid to a computational grid has forced us to a new, more fundamental, derivation of transmissibility. Unlike the standard construction, it is guaranteed never to give a negative transmissibility. We also suggest a new treatment of pinched-out cells, which regularizes the vertical non-nearest neighbor connections. Finally, we revisit Peaceman's well PI equations, and show their generalization to inclined wells, full tensor permeability, and computational cells with arbitrary numbers of faces.
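Whatever upscaling formulation is used, the result for any collection of fine cells is bracketed by the classical series/parallel (Wiener) bounds, which make a useful sanity check. A minimal sketch, assuming scalar fine-cell permeabilities of equal volume:

```python
import numpy as np

def upscale_bounds(k_fine):
    """Wiener bounds on the effective permeability of a set of fine cells.

    The true upscaled value for any geometry lies between the harmonic
    mean (cells in series, layering normal to flow) and the arithmetic
    mean (cells in parallel, layering along flow).
    """
    k = np.asarray(k_fine, dtype=float)
    k_harm = len(k) / np.sum(1.0 / k)   # lower bound
    k_arith = float(np.mean(k))         # upper bound
    return k_harm, k_arith

lo, hi = upscale_bounds([10.0, 100.0, 1000.0])
```

The wide spread between the two bounds for heterogeneous cells (here roughly 27 vs 370) is exactly why the flow geometry of the upscaling region matters so much.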
Introduction

The last three years have seen an explosive growth in the ability of the petroleum industry to develop flow simulation models based upon detailed three dimensional geologic descriptions. Until recently, such modeling activities have required access to research codes, and experts to utilize them. With a wide range of vendor tools currently available (StrataModel, IRAP/RMS, Storm, RC**2, …) it is now possible to build such models within asset teams without the direct involvement of experienced technologists. At the same time, these tools provide new challenges to the technologist, as the simple (I,J,K) "shoebox" topologies which underlie many of the theoretical algorithms are either not present or have been substantially modified within the geologic framework. We find that it is in the transition from the geologic static model to flow simulation that our theoretical foundations, and the vendor products, are most in need of reevaluation. For this reason we will report upon our experience in performing flow simulation at the scale of the fine geologic model, and the enhanced theoretical understanding which has developed as a result. Four fundamental technical issues arise: (1) the construction of the three dimensional geologic grid, which defines the data structure for all subsequent calculations; (2) the definition of physical transport properties on this geologic grid, in particular, permeability and net-to-gross; (3) their representation within a finite difference scheme in terms of transmissibility and well PI (well connection factor); and (4) the methodology for upscaling the flow properties. Aspects of these issues have been reported in the literature but with surprisingly little emphasis on the vertically complex geologic structures which have tested our theoretical understanding. New theoretical results are presented for each of these topics, with the most important ones having to do with upscaling and with transmissibility.
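For context on issue (3), the standard finite difference construction that the authors reevaluate combines two half-transmissibilities harmonically, like resistances in series. A minimal sketch of that conventional two-point form (not the paper's new derivation):

```python
def face_transmissibility(k1, k2, area, d1, d2):
    """Standard two-point transmissibility between neighbouring cells.

    k1, k2 : cell permeabilities normal to the shared face
    area   : area of the shared face
    d1, d2 : distance from each cell centre to the face
    Half-transmissibilities T_i = k_i * area / d_i are combined
    harmonically, so flux q = T * (p1 - p2) / mu.
    """
    t1 = k1 * area / d1
    t2 = k2 * area / d2
    return 1.0 / (1.0 / t1 + 1.0 / t2)
```

On a distorted corner-point grid the geometric quantities in this formula can combine to give negative values, which motivates the guaranteed-non-negative derivation described in the abstract.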
Nonetheless, we reserve the discussion of these key points until late in the paper, as they rely upon the other elements. The single most important advance involves the introduction of three dimensional permeability, which is based on a homogenized form of Darcy's equation. It extends the usual one dimensional upscaling computation, and is especially well suited to upscaling from irregularly shaped regions.
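The Peaceman well PI equations revisited in this paper have a well-known baseline form for a vertical well in a rectangular anisotropic cell, sketched below in consistent units (no field-unit conversion constants); the paper's generalization to inclined wells and full-tensor permeability goes beyond this.

```python
import math

def peaceman_wi(kx, ky, dx, dy, h, rw, skin=0.0):
    """Peaceman well index for a vertical well in an anisotropic cell.

    Returns WI such that q = WI * (p_cell - p_well) / mu,
    assuming consistent units throughout.
    """
    # Peaceman equivalent radius for an anisotropic rectangular cell.
    r_eq = (0.28 * math.sqrt(math.sqrt(ky / kx) * dx**2
                             + math.sqrt(kx / ky) * dy**2)
            / ((ky / kx) ** 0.25 + (kx / ky) ** 0.25))
    k_eff = math.sqrt(kx * ky)  # effective isotropic permeability
    return 2.0 * math.pi * k_eff * h / (math.log(r_eq / rw) + skin)
```

In the isotropic square-cell limit the equivalent radius reduces to the familiar r_eq ≈ 0.2 Δx, which is a quick check on any implementation.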
To predict and manage petroleum production there are two challenges: (i) to generate and utilise tremendous detail in our reservoir description, and (ii) to recognise that the majority of this detail is uncertain. Techniques, therefore, are necessary to assess uncertainty while making predictions of oil, water, and gas production. We demonstrate that, by using a mixture of old and new techniques (streamlines and fine grid simulation), we obtain the speed of the first while retaining the rigor and accuracy of the second. The method is applied to the rapid evaluation of the impact of reservoir heterogeneity on miscible gas injection.
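The speed of streamline techniques comes largely from the time-of-flight transform, which reduces 3D transport to a family of 1D problems along streamlines. A minimal sketch of the discrete time-of-flight computation, assuming the streamline path and interstitial velocities have already been traced through the grid:

```python
import numpy as np

def time_of_flight(segment_lengths, velocities):
    """Cumulative time of flight along a traced streamline.

    segment_lengths : length of the streamline within each traversed cell
    velocities      : interstitial (pore) velocity in each of those cells
    The travel time of a neutral tracer to each cell exit is the sum of
    segment length over velocity; saturations can then be mapped along
    the streamline as a 1D problem in this coordinate.
    """
    dt = (np.asarray(segment_lengths, dtype=float)
          / np.asarray(velocities, dtype=float))
    return np.cumsum(dt)
```

Because each streamline is independent, thousands can be processed cheaply, which is what makes the approach attractive for rapid heterogeneity screening on fine geologic grids.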