2001
DOI: 10.1007/s00158-001-0160-4

Comparative studies of metamodelling techniques under multiple modelling criteria

Cited by 1,405 publications (517 citation statements)
References 27 publications
“…These factors are the noise level and the allowed budget of evaluations (a similar classification can be found in [18]). For the tuning factors, we selected the ones which have been changed within different studies [2, 4], i.e., the proportion of observations for the initial DOE and the choice of the covariance kernel.…”
Section: Algorithmic Factors
Confidence: 99%
“…Since the real model is assumed to be deterministic (i.e., simulations of the same input configuration lead to the same output), it is desirable that the meta-model also reproduce the exact output values at the training configurations (i.e., those known with absolute certainty). In this respect, among the numerous methods available in the literature (Jin et al., 2001; Shan & Wang, 2010), we resort to Kriging (Kleijnen, 2009; Matheron, 1963), i.e., Gaussian process modeling. Kriging is capable of modeling local behaviors of the response function and of varying the accuracy of the same model across different regions.…”
Section: Meta-modeling
Confidence: 99%
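The interpolating property mentioned in the excerpt above — that a noise-free Kriging model reproduces the training outputs exactly — can be illustrated with a minimal NumPy sketch. The function name, the Gaussian covariance kernel, and the fixed length scale are illustrative choices, not taken from the cited works:

```python
import numpy as np

def kriging_predict(X_train, y_train, X_new, length_scale=1.0):
    """Simple Kriging (noise-free Gaussian process regression) in 1-D.

    Without a nugget (noise) term, the predictor interpolates the
    training data exactly, matching a deterministic simulator.
    """
    def kernel(A, B):
        # Gaussian (squared-exponential) covariance between point sets
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / length_scale**2)

    K = kernel(X_train, X_train)          # training covariance matrix
    k_star = kernel(X_new, X_train)       # cross-covariance to new points
    weights = np.linalg.solve(K, y_train) # K^{-1} y
    return k_star @ weights

X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.sin(X)
# At the training configurations the prediction equals the observed output
print(np.allclose(kriging_predict(X, y, X), y))  # True
```

In practice a small nugget term is often added to the diagonal of `K` for numerical stability, which trades exact interpolation for better conditioning.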
“…According to [33, 34], cross-validation error is a common choice to measure the accuracy of the metamodel.…”
Section: Accuracy Metrics
Confidence: 99%
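The cross-validation error referred to in the excerpt above can be sketched as a leave-one-out loop: refit the metamodel with each sample held out and score the prediction at the omitted point. The helper name and the quadratic-polynomial metamodel below are hypothetical stand-ins for whichever surrogate is being assessed:

```python
import numpy as np

def loo_cv_rmse(X, y, fit, predict):
    """Leave-one-out cross-validation RMSE of a metamodel.

    For each sample i, the model is refit on the remaining data and
    evaluated at the held-out point; the RMSE of these errors is a
    common accuracy metric for surrogates.
    """
    errors = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        model = fit(X[mask], y[mask])
        errors.append(predict(model, X[i:i + 1])[0] - y[i])
    return np.sqrt(np.mean(np.square(errors)))

# Illustrative case: a quadratic metamodel of an exactly quadratic response,
# so the leave-one-out error is essentially zero.
X = np.linspace(0.0, 1.0, 8)
y = 1.0 + 2.0 * X - 3.0 * X**2
fit = lambda X, y: np.polyfit(X, y, 2)
predict = lambda coef, X: np.polyval(coef, X)
print(loo_cv_rmse(X, y, fit, predict))
```

Because the held-out point never influences its own prediction, this metric penalizes overfitting in a way that the training-set residuals cannot.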