Abstract

Well data reveal reservoir layering with relatively high vertical resolution but are areally sparse, whereas seismic data have low vertical resolution but are areally dense. Improved reservoir models can be constructed by integrating these data. The proposed method combines stochastic seismic inversion, finer-scale well data, and geologic continuity models to build ensembles of geomodels. Stochastic seismic inversions operating at the mesoscale (≈10 m) generate rock property estimates that are consistent with regional rock physics and true-amplitude imaged seismic data. These can be used in a cascading workflow to generate ensembles of fine-scale reservoir models, wherein each realization from the Bayesian seismic inversion is treated as an exact constraint for an associated finer-scale stochastic model. We use two-point statistical models for the fine-scale model, modeling thickness and porosity of multiple facies directly. The update of these fine-scale models by the seismic constraints yields highly correlated truncated Gaussian distributions, which generate potentially rich pinchout behavior and flexible spatial connectivities in the fine-scale model.

The seismic constraints confine the fine-scale models to a posterior subspace corresponding to the constraint hypersurface. A Markov chain Monte Carlo (MCMC) method samples the posterior distribution in this subspace using projection methods that exploit the reduced dimensionality that comes with the exact constraints. These methods are demonstrated in three-dimensional flow simulations on a cornerpoint grid, illustrating the effects of stratigraphic variability on flow behavior.

Introduction

Reservoirs are sparsely sampled by well penetrations, but seismic survey results provide controls for reservoir stratigraphy and properties such as average porosity. However, beds thinner than about 1/8 to 1/4 of the dominant seismic wavelength cannot be resolved in these surveys.1,2 At a depth of 3,000 m, the maximum frequency in the signal is typically about 40 Hz and average velocities are circa 2,000 m/s; the dominant wavelength is therefore roughly 2,000/40 = 50 m, which translates to a best resolution of about 10 m. The resolution limits and errors inherent in seismic-derived estimates complicate the use of seismic inversion data.3

Mesoscale (≈10 m) reservoir models obtained by seismic inversion using rock-physics concepts and effective-media ideas are a manageable basis for Bayesian seismic integration because, as explained above, seismic is usefully informative at this scale. An attractive route to models at typical geocellular scale (≈1 m) is downscaling mesoscale models to meter-scale models using constraint equations that embody the effective-media laws. In particular, downscaling specific realizations drawn from the posterior of a stochastic mesoscale inversion produces sum or average constraint equations for the fine-scale models.

We use probabilistic depth and thickness information originating from the layer-based seismic inversion code DELIVERY4 as input to a downscaling algorithm operating on a cornerpoint grid. Seismic constraints and priors are modeled on the quasivertical block edges, analogous to seismic traces. Simulation at the edges preserves the geometric detail required for the cornerpoint reservoir models used in many commercial reservoir simulators (e.g., ECLIPSE5). Block-center properties such as porosity are obtained by averaging the edge properties.
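To make the downscaling idea concrete, the sketch below conditions a two-point (exponential-covariance) Gaussian prior on fine-scale layer thicknesses to a single exact sum constraint taken from one mesoscale inversion realization, then rejects negative draws so the result follows the truncated Gaussian described above. This is a minimal illustration, not the DELIVERY workflow: the layer count, prior moments, covariance range, and the rejection sampler (standing in for the paper's projection-based MCMC) are all assumptions.

```python
# Minimal sketch (assumed parameters, not the authors' code): condition a
# Gaussian prior on fine-scale layer thicknesses to an exact sum constraint
# from a mesoscale seismic realization, then truncate at zero thickness.
import numpy as np

rng = np.random.default_rng(0)

n = 8                      # fine-scale layers in one mesoscale interval (assumed)
mu = np.full(n, 1.5)       # prior mean thickness per layer, m (assumed)
idx = np.arange(n)
# two-point (exponential) covariance between layers, range = 3 layers (assumed)
C = 0.25 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / 3.0)

# Exact linear constraint: fine-scale thicknesses sum to the mesoscale
# interval thickness T drawn from the seismic inversion posterior.
a = np.ones(n)             # constraint vector for a sum constraint
T = 10.0                   # mesoscale thickness realization, m (assumed)

# Gaussian conditioning on a @ x = T (rank-1 update of the prior):
s2 = a @ C @ a             # variance of the constrained combination
gain = C @ a / s2
mu_c = mu + gain * (T - a @ mu)
C_c = C - np.outer(gain, a @ C)   # singular: mass lives on the hyperplane

def sample_thicknesses(max_tries=10000):
    """Draw thicknesses on the constraint hyperplane; reject negatives so the
    sample follows a truncated Gaussian (zeros would act as pinchouts)."""
    L = np.linalg.cholesky(C_c + 1e-10 * np.eye(n))  # jitter for rank deficiency
    for _ in range(max_tries):
        x = mu_c + L @ rng.standard_normal(n)
        x += a * (T - a @ x) / (a @ a)   # re-project exactly onto a @ x = T
        if np.all(x >= 0.0):
            return x
    raise RuntimeError("rejection failed; a Gibbs sampler would be needed")

x = sample_thicknesses()
print("layer thicknesses:", np.round(x, 3), " sum =", round(x.sum(), 6))
```

Note that once the constraint is active only n - 1 degrees of freedom remain; this is the reduced dimensionality that the paper's projection-based MCMC exploits.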
Summary

Development studies examine geologic, engineering, and economic factors to formulate and optimize production plans. If there are many factors, these studies are prohibitively expensive unless simulation runs are chosen efficiently. Experimental design and response models improve study efficiency and have been widely applied in reservoir engineering. To approximate nonlinear oil and gas reservoir responses, designs must consider factors at more than two levels, not just high and low values. However, multilevel designs require many simulations, especially if many factors are being considered. Partial factorial and mixed designs are more efficient than full factorials, but multilevel partial factorial designs are difficult to formulate. Alternatively, orthogonal arrays (OAs) and nearly orthogonal arrays (NOAs) provide the required design properties and can handle many factors. These designs span the factor space with fewer runs, can be manipulated easily, and are appropriate for computer experiments.

The proposed methods were used to model a gas well with water coning. Eleven geologic factors were varied while optimizing three engineering factors. An NOA was specified with three levels for eight factors and four levels for the remaining six factors. The proposed design required 36 simulations, compared with 26,873,856 runs for a full factorial design. Kriged response surfaces are compared to polynomial regression surfaces. Polynomial response models are used to optimize completion length, tubinghead pressure, and tubing diameter for a partially penetrating well in a gas reservoir with uncertain properties. OAs, Hammersley sequences (HSs), and response models offer a flexible, efficient framework for reservoir simulation studies.

Complexity of Reservoir Studies

Reservoir studies require integration of geologic properties, drilling and production strategies, and economic parameters. Integration is complex because parameters such as permeability, gas price, and fluid saturations are uncertain. In exploration and production decisions, alternatives such as well placement, artificial lift, and capital investment must be evaluated. Development studies examine these alternatives, as well as geologic, engineering, and economic factors, to formulate and optimize production plans (Narayanan et al. 2003).

Reservoir studies may require many simulations to evaluate the many factor effects on reservoir performance measures, such as net present value (NPV) and breakthrough time. Despite the exponential growth of computer memory and speed, computing accurate sensitivities and optimizing production performance is still expensive, to the point that it may not be feasible to consider all alternative models. Thus, simulation runs should be chosen as efficiently as possible. Experimental design addresses this problem statistically, and along with response models, it has been applied in engineering science (White et al. 2001; Peng and Gupta 2004; Peake et al. 2005; Sacks et al. 1989a) to:

- Minimize computational costs by choosing a small but statistically representative set of simulation runs for predicting responses (e.g., recovery)
- Decrease expected error compared with nonoptimal simulation designs (i.e., sets of sample points)
- Evaluate sensitivity of responses to varying factors
- Translate uncertainty in input factors to uncertainty in predicted performance (i.e., uncertainty analysis)
- Estimate the value of information, to focus resources on reducing uncertainty in the factors with the most significant effect on response uncertainty and to help optimize engineering factors
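As a minimal illustration of the design-plus-response-model loop (not the paper's study), the sketch below generates a space-filling Hammersley design for two normalized factors, evaluates a stand-in "simulator," fits a quadratic polynomial response surface by least squares, and reads sensitivities off the coefficients. The toy response function, the two-factor setup, and the reuse of the paper's 36-run budget are illustrative assumptions; a real study would run the OA/NOA design points through the reservoir simulator.

```python
# Sketch of design + polynomial response modeling (assumed toy problem).
import numpy as np

def radical_inverse(i, base):
    """Van der Corput radical inverse of integer i in the given base."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += f * (i % base)
        i //= base
        f /= base
    return inv

def hammersley(n, dim):
    """n points of a dim-dimensional Hammersley sequence in [0, 1)^dim."""
    primes = [2, 3, 5, 7, 11, 13]
    pts = np.empty((n, dim))
    pts[:, 0] = np.arange(n) / n
    for j in range(1, dim):
        pts[:, j] = [radical_inverse(i, primes[j - 1]) for i in range(n)]
    return pts

def toy_simulator(x):
    """Stand-in for a reservoir simulator response (e.g., NPV); assumed form."""
    perm, pressure = x[:, 0], x[:, 1]
    return 5.0 + 3.0 * perm - 2.0 * pressure + 1.5 * perm * pressure - perm**2

n = 36                       # same run budget as the paper's NOA (coincidental here)
X = hammersley(n, 2)         # two factors, scaled to [0, 1)
y = toy_simulator(X)

# Quadratic response surface:
#   y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
A = np.column_stack([np.ones(n), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
# Large-magnitude coefficients flag the sensitive factors, and the cheap
# polynomial surrogate can then be optimized in place of the simulator.
```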