2014
DOI: 10.1007/s11004-014-9573-7

Fast Update of Conditional Simulation Ensembles

Abstract: Gaussian random fields (GRF) conditional simulation is a key ingredient in many spatial statistics problems for computing Monte-Carlo estimators and quantifying uncertainties on non-linear functionals of GRFs conditional on data. Conditional simulations are known to often be computer intensive, especially when appealing to matrix decomposition approaches with a large number of simulation points. Here we study the settings where conditioning observations are assimilated batch-sequentially, i.e. one point or batch…
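As an illustration of the setting, the sketch below shows conditioning by kriging (residual substitution) for a zero-mean GRF, followed by a residual-based correction of an existing simulation ensemble when a new batch of observations is assimilated. This is a minimal sketch under simple-kriging and Gaussian-covariance assumptions; the helper names (gauss_cov, kriging_weights, conditional_simulations, update_simulations) and the 1-D demo values are hypothetical and not taken from the paper, whose exact formulae and implementation may differ.

```python
import numpy as np

def gauss_cov(a, b, sill=1.0, length=0.3):
    # Gaussian (squared-exponential) covariance between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return sill * np.exp(-0.5 * (d / length) ** 2)

def kriging_weights(X_obs, x_pred, jitter=1e-8):
    # Simple-kriging weights Lambda such that the kriging mean is Lambda.T @ y_obs.
    K = gauss_cov(X_obs, X_obs) + jitter * np.eye(len(X_obs))
    k = gauss_cov(X_obs, x_pred)
    return np.linalg.solve(K, k)                    # shape (n_obs, n_pred)

def conditional_simulations(X_obs, y_obs, x_sim, n_sims, rng):
    # Conditioning by kriging (residual substitution) for a zero-mean GRF:
    # conditional path = kriging mean of the data
    #                    + (unconditional path - kriging mean of that path's
    #                       own values at the observation points).
    X_all = np.concatenate([X_obs, x_sim])
    K_all = gauss_cov(X_all, X_all) + 1e-8 * np.eye(len(X_all))
    L = np.linalg.cholesky(K_all)
    Z_u = L @ rng.standard_normal((len(X_all), n_sims))   # unconditional paths
    Zu_obs, Zu_sim = Z_u[:len(X_obs)], Z_u[len(X_obs):]
    Lam = kriging_weights(X_obs, x_sim)
    return Lam.T @ y_obs[:, None] + Zu_sim - Lam.T @ Zu_obs

def update_simulations(Z_cond, X_old, X_new, y_new, x_sim, Z_cond_at_new):
    # Residual update of already conditioned paths: correct each path with the
    # kriging weights of the new batch, taken from the kriging on old + new points.
    X_all = np.concatenate([X_old, X_new])
    Lam_new = kriging_weights(X_all, x_sim)[len(X_old):]   # weights of new points
    return Z_cond + Lam_new.T @ (y_new[:, None] - Z_cond_at_new)

# Hypothetical 1-D demo: condition on a first batch, then assimilate a second one.
rng = np.random.default_rng(0)
X1, y1 = np.array([0.12, 0.47, 0.91]), np.array([0.3, -0.8, 0.5])
X2, y2 = np.array([0.33, 0.68]), np.array([0.1, -0.2])     # new batch

# Simulate jointly on the prediction grid and at the future batch locations,
# so that the paths' values at X2 are available for the later update.
x_pred = np.linspace(0.0, 1.0, 101)
x_sim = np.concatenate([x_pred, X2])
Z1 = conditional_simulations(X1, y1, x_sim, n_sims=50, rng=rng)

# Reuse the existing ensemble instead of re-simulating from scratch.
Z2 = update_simulations(Z1[:len(x_pred)], X1, X2, y2, x_pred, Z1[len(x_pred):])
print(Z2.shape)   # (101, 50): updated conditional paths on the prediction grid
```

The point mirrored here is that the already simulated paths Z1 are reused and corrected with residuals at the new batch locations, rather than re-decomposing the full covariance matrix over all simulation points each time a batch arrives.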

Cited by 23 publications (31 citation statements)
References 24 publications
“…First, it is important to study the impact of the initial design and the step size on the predictive performance of SK applied with the sequential strategies proposed, with an emphasis on the roles that the step size ∆n and the initial number of simulation replications n play. Second, a natural extension is to consider augmenting an initial design in a batch-sequential manner, that is, to add several new design points to the existing design-point set on each iteration (see, e.g., Loeppky et al., 2010), which will facilitate exploiting distributed computing power to enable a fast parallel update of SK metamodels (see Chevalier et al., 2015). Third, for high-dimensional problems, it is imperative that we are able to construct an adequate candidate design-point set and to update the set adaptively.…”
Section: Discussion (mentioning, confidence: 99%)
“…In addition, techniques for simulating efficiently over more points and updating simulations with new observations [40] should be considered, as well as optimization of simulation point locations with re-interpolation or re-simulation [41]. Another direction for future research includes the integration of the proposed uncertainty estimate in a Stepwise Uncertainty Reduction (SUR) strategy [42] as an infill criterion.…”
Section: Discussion (mentioning, confidence: 99%)
“…A framework with a similar aim has recently been proposed by Chevalier et al. (2014) to update the conditional simulations at minimal cost. The formulae offer significant computational savings when the number of conditioning observations is large, and quantify the effect of the newly assimilated observations on already simulated sample paths.…”
Section: A Method for Updating Coal Attributes in a Short-Term Model (mentioning, confidence: 99%)
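For orientation, the residual-substitution identity that this kind of fast update relies on can be written (in notation chosen here for illustration, under a simple-kriging assumption, and not copied verbatim from the paper) as

Z^{(n+q)}(x) = Z^{(n)}(x) + \lambda_{\mathrm{new}}(x)^{\top}\bigl(y_{\mathrm{new}} - Z^{(n)}(X_{\mathrm{new}})\bigr),

where Z^{(n)} denotes a sample path conditioned on the first n observations, y_new collects the q newly assimilated values at locations X_new, and \lambda_{\mathrm{new}}(x) contains the kriging weights assigned to the new locations by the predictor built on all n + q points. Already simulated paths are reused and merely corrected, which is the source of the computational savings mentioned in the excerpt above.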