2008
DOI: 10.1198/004017008000000541

Sequential Experiment Design for Contour Estimation From Complex Computer Codes

Abstract: Computer simulation often is used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is to fit a response surface or process optimization. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The appro…

Cited by 228 publications (213 citation statements)
References 25 publications

“…The expected feasibility function (EFF) is introduced here to provide an indication of how well the true value of the response is expected to satisfy the equality constraint G(u) = z̄. Inspired by the contour estimation work in [67], this expectation can be calculated in a similar fashion to Eq. 1.52 by integrating over a region in the immediate vicinity of the threshold value, z̄ ± ε:…”
Section: Expected Feasibility Function
confidence: 99%
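
Under a Gaussian process emulator the EFF has a closed form, because integrating the feasibility weight (ε − |z̄ − z|) against a normal predictive density reduces to normal cdf/pdf terms. A minimal sketch, not the citing paper's code: `mu` and `sigma` are assumed to be the kriging predictive mean and standard deviation at a candidate point, `z_bar` the threshold, and `eps` the half-width of the band z̄ ± ε.

```python
from scipy.stats import norm

def expected_feasibility(mu, sigma, z_bar, eps):
    """Closed-form EFF for a Gaussian predictive distribution N(mu, sigma**2).

    Integrates the feasibility weight (eps - |z_bar - z|) against the
    predictive density over the band [z_bar - eps, z_bar + eps].
    """
    t = (z_bar - mu) / sigma             # standardized distance to threshold
    t_lo = (z_bar - eps - mu) / sigma    # lower band edge, standardized
    t_hi = (z_bar + eps - mu) / sigma    # upper band edge, standardized
    return ((mu - z_bar) * (2.0 * norm.cdf(t) - norm.cdf(t_lo) - norm.cdf(t_hi))
            - sigma * (2.0 * norm.pdf(t) - norm.pdf(t_lo) - norm.pdf(t_hi))
            + eps * (norm.cdf(t_hi) - norm.cdf(t_lo)))
```

Candidates with large EFF are those whose predictions sit close to the constraint surface or are still too uncertain to rule the band out, which is what makes it a natural refinement criterion.
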
“…For example, Ranjan et al. (2008, with corrections in 2011), Bichon et al. (2008), and Picheny et al. (2010) all devised such adaptive sampling algorithms based on Kriging and its uncertainty model. Some of these strategies are motivated by reliability calculations rather than optimization, so the algorithm developed by Bichon et al. (2008) is called EGRA, for efficient global reliability analysis.…”
Section: Problems With Constraints
confidence: 99%
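
The Ranjan et al. (2008) criterion rewards candidate inputs that are both close to the target contour y(x) = a and still poorly resolved: the improvement is I(x) = ε(x)² − min{(Y(x) − a)², ε(x)²} with ε(x) = α·s(x). Since the closed-form expectation is exactly the part that required the 2011 corrections, the sketch below estimates E[I(x)] by Monte Carlo instead; `mu` and `s` are assumed to be the kriging predictive mean and standard deviation at the candidate, and `alpha = 1.96` is the customary choice.

```python
import numpy as np

def contour_improvement_mc(mu, s, a, alpha=1.96, n_draws=100_000, seed=0):
    """Monte Carlo estimate of E[I(x)] for the contour-improvement criterion
    I(x) = eps**2 - min{(Y(x) - a)**2, eps**2}, with eps = alpha * s and
    Y(x) ~ N(mu, s**2) the kriging predictive distribution at x.
    """
    rng = np.random.default_rng(seed)
    eps = alpha * s
    y = rng.normal(mu, s, size=n_draws)                  # predictive draws
    improvement = eps**2 - np.minimum((y - a) ** 2, eps**2)
    return improvement.mean()
```

Points that maximize this quantity lie near the contour while the emulator is still uncertain there, which is precisely where the next expensive run is most informative.
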
“…Most, if not all, reviewed work relies on sequential sampling for the gradual improvement of the surrogate in the vicinity of the failure domain. It is worth mentioning that some authors, such as [3,4], have explored those strategies for the accurate estimation of target regions in general. They consider that the emulator should be refined only in the regions of interest, and that a good-quality surrogate there is all that is needed to obtain accurate reliability estimates.…”
Section: Introduction
confidence: 99%
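
The refine-only-where-it-matters idea reduces to a simple loop: fit the emulator, score a pool of candidates with a criterion concentrated on the target region, run the expensive code at the best candidate, and refit. A minimal sketch under stated assumptions, not any cited paper's algorithm: it uses a scikit-learn GP and a deliberately simple stand-in acquisition (the predictive probability of landing within a ± eps of the contour); the names `refine_near_target`, `f`, and the `eps` default are illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def refine_near_target(f, X, y, candidates, a, eps=0.05, n_iter=20):
    """Sequentially refine a GP emulator only near the target region
    {x : f(x) = a}, evaluating the expensive code f one point at a time.
    """
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        mu, s = gp.predict(candidates, return_std=True)
        s = np.maximum(s, 1e-12)                       # guard against zero std
        # Probability that the response lands within a +/- eps of the contour.
        p_band = norm.cdf((a + eps - mu) / s) - norm.cdf((a - eps - mu) / s)
        best = int(np.argmax(p_band))
        X = np.vstack([X, candidates[best]])
        y = np.append(y, f(candidates[best]))
        candidates = np.delete(candidates, best, axis=0)
    gp.fit(X, y)                                       # final refit
    return gp, X, y
```
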
“…These issues are addressed in the present article. The other major part of all adaptive algorithms is the stopping condition. This ranges from the use of reliability indices [7,8], through error in the estimation of the failure probability [5,13,15,16,17] and measures of the discrepancy between the GPE predictions and code observations [4,6,9,18], to thresholds on the learning function [3,12,14]. Most frameworks use some form of statistic related to the surrogate, which, depending on the use and complexity of the problem, could prove insufficiently robust.…”
Section: Introduction
confidence: 99%
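
As one concrete instance of a threshold-on-learning-function stopping rule, the U function of AK-MCS (Echard et al., 2011) counts, in predictive standard deviations, how far each candidate's prediction sits from the limit state; learning stops once every candidate is at least `u_min` (typically 2) deviations away. A minimal sketch, assuming `mu` and `sigma` are the GP predictive moments over a candidate pool and `z_bar` is the limit-state threshold:

```python
import numpy as np

def u_learning_stop(mu, sigma, z_bar, u_min=2.0):
    """Stop when every candidate's prediction is at least u_min predictive
    standard deviations from the limit state z_bar; min U >= 2 bounds each
    point's misclassification probability by Phi(-2), roughly 2.3%.
    """
    sigma = np.maximum(np.asarray(sigma, dtype=float), 1e-12)
    u = np.abs(z_bar - np.asarray(mu, dtype=float)) / sigma
    return bool(u.min() >= u_min)
```

Because the rule depends only on the surrogate's own uncertainty statement, it inherits the robustness caveat raised above: if the GP miscalibrates its variance, the criterion can declare convergence prematurely.
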