SPE Reservoir Simulation Symposium 2001
DOI: 10.2118/66394-ms

Scale Splitting Approach to Reservoir Characterization

Abstract: Production forecasts are essential for sound reservoir management. The foundation for such forecasts is a characterization of relevant reservoir properties. These properties are usually determined through history matching, using production data, static well data (hard data), and the upscaled geological model (prior model) simultaneously. This process is often very complex and costly, both in terms of CPU time and man-hours. To reduce the cost and complexity of r…
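As a rough illustration of the history-matching step described in the abstract, the sketch below tunes a toy forward model so that its output matches synthetic production data by least-squares. The decline-curve "simulator", parameter names, and data are illustrative assumptions only, not the paper's scale-splitting method.

import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 5.0, 30)                    # years, hypothetical production schedule
obs_rate = 800.0 * np.exp(-0.35 * t)             # stand-in for observed production data

def simulate(params):
    """Toy forward model: exponential-decline proxy for a reservoir simulator."""
    q0, decline = params
    return q0 * np.exp(-decline * t)

def residuals(params):
    # Mismatch between simulated and observed production history
    return simulate(params) - obs_rate

# The prior model supplies the starting guess; optimization adjusts it to fit history.
fit = least_squares(residuals, x0=[500.0, 0.1])
print(fit.x)                                     # recovered (q0, decline)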

Cited by 21 publications (2 citation statements)
References 13 publications
“…The importance of parametrization in subsurface simulations resulted in a variety of methods in the literature including zonation [20,24] and zonation-based methods [5,9,15,16,17,1], PCA-based methods [13,40,43,46,32,55], SVD-based methods [48,47,52,51], discrete wavelet transform [33,31,44], discrete cosine transform [21,22,23], level set methods [35,10,8], and dictionary learning [26,27]. Many methods begin by proposing parametric forms for the random vector to be modeled which are then explicitly fitted to preserve certain chosen statistics.…”
Section: Introduction
Confidence: 99%
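For context on the PCA/SVD-based parameterization families mentioned in the statement above, the following is a minimal sketch that builds a low-dimensional parameterization of a reservoir property field from a prior ensemble. The ensemble, grid size, number of retained components, and scaling are illustrative assumptions, not taken from the cited papers.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prior ensemble: n_real realizations of a field with n_cells cells
n_real, n_cells = 200, 2500
ensemble = rng.standard_normal((n_real, n_cells))   # stand-in for log-permeability realizations

mean = ensemble.mean(axis=0)
centered = ensemble - mean

# SVD of the centered ensemble gives the principal components
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Keep the leading n_pc components; the field is now described by
# n_pc coefficients instead of n_cells cell values.
n_pc = 20
basis = Vt[:n_pc]                                    # (n_pc, n_cells)

def reconstruct(xi):
    """Map a low-dimensional coefficient vector xi back to a full field."""
    return mean + xi @ (s[:n_pc, None] * basis) / np.sqrt(n_real - 1)

xi = rng.standard_normal(n_pc)                       # parameters adjusted during history matching
field = reconstruct(xi)
print(field.shape)                                   # (2500,)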
“…One way to attempt to deal with such problems is via reparameterization by reducing the number of model parameters (reservoir variables). This type of approach has a long history beginning with the pioneering work on zonation of Jacquard and Jain (1965) and Jahns (1966) and continuing today with work on adaptive parameterization; see, for example, Grimstad et al (2001). During the first year of this project report, we investigated reparameterization based on subspace methods (Kennett and Williamson, 1988; Oldenburg et al, 1993; Oldenburg and Li, 1994) to reduce the number of parameters directly estimated in the optimization process.…”
Section: Introduction
Confidence: 99%
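The zonation idea referenced in the statement above (estimating one value per zone rather than one per grid cell) can be sketched as follows. The grid dimensions, zone layout, and property values are illustrative assumptions, not the method of Jacquard and Jain (1965) or of the cited paper.

import numpy as np

nx, ny = 50, 50                       # hypothetical 2D grid
n_zones = 8

# A simple fixed zonation: stripe the grid into n_zones vertical bands.
col = np.arange(nx * ny) % nx         # column index of each cell (row-major ordering)
zone_index = (col * n_zones) // nx    # zone id for each cell, shape (nx*ny,)

def zones_to_grid(zone_values):
    """Expand the n_zones parameters into a full per-cell property array."""
    return np.asarray(zone_values)[zone_index]

# History matching would now adjust only these n_zones numbers rather than nx*ny cell values.
zone_perm = np.linspace(50.0, 400.0, n_zones)   # illustrative permeability per zone, mD
perm_field = zones_to_grid(zone_perm).reshape(ny, nx)
print(perm_field.shape)               # (50, 50)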