2017 Winter Simulation Conference (WSC) 2017
DOI: 10.1109/wsc.2017.8247911

Deep Gaussian Process metamodeling of sequentially sampled non-stationary response surfaces

Abstract: Simulations are often used in the design of complex systems, as they allow the design space to be explored without building several prototypes. Over the years, both simulation accuracy and the associated computational cost have increased significantly, limiting the overall number of simulations affordable during the design process. Metamodeling therefore aims to approximate the simulation response with a cheap-to-evaluate mathematical approximation, learned from a limited set of simulator evaluations. Kernel-…

Cited by 6 publications (7 citation statements); References 15 publications
“…The performance measures are calculated using the formulae shown in numbers 6 through 9. Also, the loss function 58 was used as a performance metric.…”
Section: Model Evaluation (citation type: mentioning; confidence: 99%)
“…Deep Gaussian processes are now used to solve the drawbacks of Gaussian process models. 15 Iteratively convolving several GP functions across an image is accomplished using a deep convolutional Gaussian process. 16 Traditional (single-layer) GPs have limitations, and a non-parametric deep Bayesian method known as a deep Gaussian process (DGP) uses a hierarchical composition of GPs to address these limitations while maintaining the benefits of GPs.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…Deep Gaussian Process Regression (DGPR) is evolved on the basis of GPR which has been widely used due to its analytical tractability, a training procedure which automatically guards against overfitting and ability to provide a full predictive distribution over the outputs [22]. In detail, the GPR implementation can be realized by an analytic solution, but the selection of a single kernel function may restrict the prediction ability of the model.…”
Section: Deep Gaussian Process Regression (citation type: mentioning; confidence: 99%)
“…Different from the single-layer GPR which can only represent a restricted class of functions, DGPR is a statistical multi-layer hierarchical model of GPR [23] which the output of the current layer acts as the next layer input and enables a deep probabilistic nonparametric approach to flexibly tackle complex machine-learning problems with sound quantification of uncertainty [20]. In other words, DGPR retains useful properties of GPR [22] and overcomes the limitations of GPR, which is suited for representation of highly complex data relationships.…”
Section: Deep Gaussian Process Regression (citation type: mentioning; confidence: 99%)
“…Some inroads have been made along these lines. Dutordoir et al (2017) fit DGPs to sequentially collected data, but acquisition criteria were not based on the DGP fits. Rajaram et al (2020) suggested a maximum variance criterion, a strategy sometimes called "active learning MacKay" (ALM; MacKay, 1992), with DGPs.…”
Section: Introduction (citation type: mentioning; confidence: 99%)