2021
DOI: 10.1007/s10596-021-10046-1

Novel iterative ensemble smoothers derived from a class of generalized cost functions

Abstract: Iterative ensemble smoothers (IES) are among the state-of-the-art approaches to solving history matching problems. From an optimization-theoretic point of view, these algorithms can be derived by solving certain stochastic nonlinear-least-squares problems. In a broader picture, history matching is essentially an inverse problem, which is often ill-posed and may not possess a unique solution. To mitigate the ill-posedness, in the course of solving an inverse problem, prior knowledge and domain experience are of…

Cited by 22 publications (31 citation statements)
References 33 publications
“…See Equation (49) in Luo (2021) for detailed derivations of the identity above. In general, the observation operator H is nonlinear, in which case the square root matrix S_y as estimated in Equation (A 1) provides a derivative-free approximation to the projected square root matrix HS_w.…”
Section: Discussion
confidence: 99%
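The derivative-free approximation described in this excerpt can be illustrated numerically. Below is a minimal sketch (not the paper's exact construction; the operator H, the dimensions, and all variable names are illustrative assumptions): the ensemble square root S_w is built from centred state perturbations, and S_y from the perturbations of the ensemble pushed through the observation operator, with no Jacobian ever formed. For a linear operator the approximation S_y ≈ H S_w becomes exact.

```python
import numpy as np

rng = np.random.default_rng(0)
m, d, ne = 5, 3, 50                  # state dim, obs dim, ensemble size (illustrative)

H = rng.standard_normal((d, m))      # a linear observation operator (stand-in)
W = rng.standard_normal((m, ne))     # ensemble of model states, one member per column

# Ensemble square root S_w: centred perturbations scaled by 1/sqrt(Ne - 1)
S_w = (W - W.mean(axis=1, keepdims=True)) / np.sqrt(ne - 1)

# Derivative-free S_y: push each member through the observation operator,
# then centre and scale -- no derivative of H is used anywhere
Y = H @ W                            # would be h(w_j) column by column for nonlinear h
S_y = (Y - Y.mean(axis=1, keepdims=True)) / np.sqrt(ne - 1)

# For a linear operator the approximation S_y ≈ H S_w holds exactly
print(np.allclose(S_y, H @ S_w))     # → True
```

For a nonlinear h the same recipe still applies; S_y then approximates the projected square root in a statistical-linearization sense rather than matching it exactly.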
“…Note that this method involves the Hessian of the cost function (Evensen 2018; Luo 2021) and provides quantified uncertainties based on Bayesian analysis (Zhang et al 2020). Similar to the ensemble gradient method, the ensemble Kalman inversion method also approximates the sensitivity of velocity to neural-network weights based on the ensemble cross-covariance matrix, without involving the analytic gradient of the neural network.…”
Section: Various
confidence: 99%
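The gradient-free sensitivity approximation this excerpt refers to can be sketched as follows (an illustrative toy, not the cited method: the linear map G stands in for the network's forward response, and all names and dimensions are assumptions). The sensitivity is estimated by regressing ensemble outputs on ensemble parameters via sample cross- and auto-covariances; for a linear map this recovers the operator itself without ever differentiating it.

```python
import numpy as np

rng = np.random.default_rng(1)
m, d, ne = 4, 2, 100                 # parameter dim, output dim, ensemble size

G = rng.standard_normal((d, m))      # stand-in linear forward map (illustrative)
W = rng.standard_normal((m, ne))     # parameter ensemble, one member per column
Y = G @ W                            # forward responses, member by member

Wp = W - W.mean(axis=1, keepdims=True)   # centred parameter perturbations
Yp = Y - Y.mean(axis=1, keepdims=True)   # centred output perturbations

C_ww = Wp @ Wp.T / (ne - 1)          # parameter auto-covariance
C_yw = Yp @ Wp.T / (ne - 1)          # output/parameter cross-covariance

# Gradient-free sensitivity estimate: regression of outputs on parameters
G_hat = C_yw @ np.linalg.inv(C_ww)
print(np.allclose(G_hat, G))         # → True for a linear map
```

For a nonlinear forward map the same estimator yields an averaged (statistically linearized) sensitivity over the ensemble spread, which is exactly what lets ensemble Kalman inversion avoid the analytic gradient.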
“…The MAC problem in Eq. 2 can be approximately solved through the following IES model update formula [23,25]:…”
Section: Generalized Iterative Ensemble Smoother (GIES)
confidence: 99%
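The excerpt does not reproduce the update formula itself, so the following is only a generic ensemble-smoother-type iteration step of the kind such formulas take (all operators, dimensions, and the regularization parameter gamma are illustrative assumptions, not the paper's Eq. 2): a Kalman-gain-like matrix is assembled from ensemble square roots and applied to the residuals between perturbed data and simulated responses.

```python
import numpy as np

rng = np.random.default_rng(2)
m, d, ne = 6, 4, 40                  # model dim, data dim, ensemble size

H = rng.standard_normal((d, m))      # linear forward operator (stand-in)
C_d = 0.1 * np.eye(d)                # observation-error covariance (assumed)
d_obs = rng.standard_normal(d)       # observed data (synthetic)
gamma = 1.0                          # regularization parameter of the iteration

W = rng.standard_normal((m, ne))     # current ensemble of model states

# Ensemble square roots of state and simulated-data perturbations
S_w = (W - W.mean(axis=1, keepdims=True)) / np.sqrt(ne - 1)
Y = H @ W
S_y = (Y - Y.mean(axis=1, keepdims=True)) / np.sqrt(ne - 1)

# One smoother-type update: gain built from the square roots,
# applied to residuals against data perturbed with observation noise
K = S_w @ S_y.T @ np.linalg.inv(S_y @ S_y.T + gamma * C_d)
D = d_obs[:, None] + rng.multivariate_normal(np.zeros(d), C_d, ne).T
W_next = W + K @ (D - Y)
print(W_next.shape)                  # → (6, 40)
```

In an actual IES run this step would be repeated with gamma adapted between iterations and Y re-simulated from the updated ensemble.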
“…5 contains two equivalent ways of expressing the Kalman-gain matrix K_i. The first expression is the one often used in the literature, whereas the second expression can be derived by applying a certain matrix identity to the first one [23].…”
Section: Generalized Iterative Ensemble Smoother (GIES)
confidence: 99%
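The excerpt does not reproduce its Eq. 5, but the equivalence it describes is usually the standard identity relating the covariance form and the information form of the Kalman gain, obtained via the Sherman-Morrison-Woodbury identity. A minimal numerical check of that identity (with illustrative matrices P, R, H; these are not the quantities from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(3)
m, d = 5, 3                          # state dim, data dim (illustrative)

H = rng.standard_normal((d, m))      # observation operator (stand-in)
A = rng.standard_normal((m, m))
P = A @ A.T + np.eye(m)              # SPD prior covariance
R = 0.5 * np.eye(d)                  # SPD observation-error covariance

# First (common) form: covariance pushed through the observation operator
K1 = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Second form, derived from the first via Sherman-Morrison-Woodbury
K2 = np.linalg.inv(np.linalg.inv(P) + H.T @ np.linalg.inv(R) @ H) @ H.T @ np.linalg.inv(R)

print(np.allclose(K1, K2))           # → True
```

The first form inverts a d×d matrix and the second an m×m one, which is why the preferred expression in practice depends on whether the data or the state dimension is larger.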