Applications of Statistics and Probability in Civil Engineering 2011
DOI: 10.1201/b11332-100
Metamodel-based importance sampling for the simulation of rare events

Abstract: In the field of structural reliability, the Monte Carlo estimator is considered the reference probability estimator. However, it remains intractable for real engineering cases, since it requires a high number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One approach consists in replacing the original experiment with a surrogate that is much faster to evaluate. Nevertheless, it is often difficult (or e…

Cited by 17 publications (17 citation statements) · References 17 publications
“…This indicator can be directly used to refine the model, i.e., to choose new points at which to evaluate the real function so as to improve the accuracy of the model. Kriging has been used extensively with the classical Monte Carlo estimator [152], importance sampling [145,153–155], importance sampling with control variates [156], and subset simulation [157–159]. How the Kriging model is refined is a key point, and different strategies have been proposed [155,160,161] to exploit the complete probabilistic description given by the Kriging model while evaluating a minimal number of points of the real, expensive input-output function.…”
Section: Use of Metamodels in Rare Event Probability Estimation
confidence: 99%
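To make the Kriging-plus-Monte-Carlo combination mentioned above concrete, here is a minimal Python sketch, not the cited authors' exact algorithms: a Gaussian process surrogate is fitted on a small design of experiments, and its predictive mean and standard deviation give, at each point, the probability Φ(−μ(x)/σ(x)) that the surrogate classifies the point as failed. The toy limit-state function `g` and the helper `pi_failure` are hypothetical names introduced for illustration.

```python
import numpy as np
from scipy import stats
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
dim = 2

def g(x):
    # Hypothetical cheap stand-in for an expensive limit-state function;
    # "failure" is the rare event g(x) <= 0.
    return 3.0 - 0.5 * np.sum(x**2, axis=1)

# Small design of experiments (DoE) on the expensive function.
X_doe = rng.normal(size=(30, dim))
y_doe = g(X_doe)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_doe, y_doe)

def pi_failure(x):
    # Kriging predictive mean/std give, pointwise, the probability that
    # the (random) surrogate lies below the failure threshold 0.
    mu, sd = gp.predict(x, return_std=True)
    return stats.norm.cdf(-mu / np.maximum(sd, 1e-12))

# Classical Monte Carlo on the surrogate classification probability.
X_mc = rng.normal(size=(100_000, dim))
print("surrogate-based estimate of P(g <= 0):", pi_failure(X_mc).mean())
```

The surrogate replaces all 100,000 limit-state calls with 30 expensive evaluations plus cheap GP predictions, which is the cost structure these Kriging-based methods exploit.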
“…, $q_T$ also relates to the original subset simulation algorithm. Indeed, note that $q_t(x) \propto \mathbb{E}_{n_t}\!\left[\mathbf{1}_{\xi > u_t}\right]\pi_X$, and recall from Equation (3) that $\tilde{q}_t \propto \mathbf{1}_{\xi > u_t}\,\pi_X$ is the target distribution used in the subset simulation algorithm at stage $t$. This choice of instrumental density is also used by [27,28] in the context of a two-stage adaptive importance sampling algorithm. This is indeed quite a natural choice, since $\tilde{q}_t \propto \mathbf{1}_{\xi > u_t}\,\pi_X$ is the optimal instrumental density for the estimation of $\alpha_t$ by importance sampling [see, e.g., 49, Theorem 3.12].…”
Section: 2
confidence: 99%
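The optimality claim invoked in this quotation is a standard importance sampling fact, written out below for reference (not specific to the cited paper):

```latex
% Zero-variance (optimal) instrumental density for estimating
% \alpha_t = P(\xi(X) > u_t), with X ~ \pi_X:
\tilde{q}_t(x) = \frac{\mathbf{1}_{\{\xi(x) > u_t\}}\,\pi_X(x)}{\alpha_t},
\qquad
\widehat{\alpha}_t = \frac{1}{N}\sum_{i=1}^{N}
  \mathbf{1}_{\{\xi(X_i) > u_t\}}\,\frac{\pi_X(X_i)}{\tilde{q}_t(X_i)},
\quad X_i \sim \tilde{q}_t .
% Under \tilde{q}_t, every term of the sum equals \alpha_t, so the
% estimator has zero variance; but \tilde{q}_t involves the unknown
% \alpha_t, so in practice it can only be approximated, e.g. via a
% metamodel of \xi.
```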
“…The first issue, designing efficient algorithms to estimate α in the case of an expensive-to-evaluate limit-state function, can be seen as a problem of design and analysis of computer experiments (see, e.g., [50]), bearing some similarities to the problem of global optimization (see [53] and references therein). Several sequential design strategies based on Gaussian process models have been proposed in the literature, and spectacular savings in evaluations have been demonstrated on various examples with moderately small α (typically $10^{-2}$ or $10^{-3}$); see [6] for a review of fully sequential strategies and [3,27] for examples of two-stage strategies. The closely related problem of quantile estimation has also been investigated along similar lines [1,13,46].…”
confidence: 99%
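As one illustration of a sequential design step of the kind reviewed in [6], the sketch below enriches the DoE from the earlier snippet using the well-known "U" learning criterion U(x) = |μ(x)|/σ(x), where small values flag the points most likely misclassified by the surrogate. This criterion comes from the AK-MCS family of strategies, not from the paper discussed here, and the snippet reuses the hypothetical `gp`, `g`, `X_doe`, `y_doe`, `rng`, and `dim` objects defined above.

```python
# One enrichment step with the U criterion.
candidates = rng.normal(size=(10_000, dim))
mu, sd = gp.predict(candidates, return_std=True)
U = np.abs(mu) / np.maximum(sd, 1e-12)

x_new = candidates[np.argmin(U)]   # point most ambiguous wrt the threshold
y_new = g(x_new[None, :])          # a single expensive evaluation

X_doe = np.vstack([X_doe, x_new[None, :]])
y_doe = np.concatenate([y_doe, y_new])
gp.fit(X_doe, y_doe)               # refit the Kriging model
```

Iterating this loop until the estimated probability stabilizes is the basic pattern of the fully sequential strategies mentioned in the quotation.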
“…Metamodel-based algorithms. Since the modified Monte Carlo methods seen above still require a large number of samples and do not allow full parallelisation, metamodel-based algorithms propose to spend the computational budget on fitting a surrogate model to the expensive-to-evaluate function g, and then to use it in place of the true function to compute probability estimates with the usual methods [14,5,13]. These strategies therefore depend strongly on the quality of the Design of Experiments (DoE), and especially on its ability to predict the boundary between the safety and failure domains, i.e.…”
Section: Introduction
confidence: 99%
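Finally, a minimal sketch of how such a surrogate can drive an importance sampling step, under the same toy setup as above. The Gaussian instrumental density fitted below is an illustrative, cross-entropy-style choice, not the method of any particular reference; `pi_failure`, `g`, `rng`, and `dim` are the hypothetical objects from the first snippet.

```python
from scipy import stats

# Pool of input samples, weighted by the surrogate failure probability.
X_pool = rng.normal(size=(50_000, dim))
w = pi_failure(X_pool)

# Fit a Gaussian instrumental density to the weighted pool
# (purely illustrative; real failure domains may be poorly Gaussian).
mean_h = np.average(X_pool, axis=0, weights=w)
cov_h = np.cov(X_pool.T, aweights=w)

X_is = rng.multivariate_normal(mean_h, cov_h, size=5_000)

# Importance weights: input density (standard normal) over instrumental.
f = stats.multivariate_normal(np.zeros(dim)).pdf(X_is)
h = stats.multivariate_normal(mean_h, cov_h).pdf(X_is)

# The true g is called here; a full metamodel-based method would call
# the surrogate instead and correct for the surrogate error.
p_hat = np.mean((g(X_is) <= 0) * f / h)
print("metamodel-driven importance sampling estimate:", p_hat)
```

Because the instrumental samples concentrate where the surrogate expects failure, far fewer calls to g are spent in the safe domain than with crude Monte Carlo, which is the efficiency argument made in the quoted passage.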