2017
DOI: 10.1007/s11831-017-9226-3
An Overview of Gradient-Enhanced Metamodels with Applications

Abstract: Metamodeling, the science of modeling functions observed at a finite number of points, benefits from all the auxiliary information it can account for. Function gradients are a common form of auxiliary information and are useful for predicting functions with locally changing behaviors. This article is a review of the main metamodels that use function gradients in addition to function values. The goal of the article is to give the reader both an overview of the principles involved in gradient-enhanced metamodels while also p…


Cited by 69 publications (52 citation statements)
References 110 publications (157 reference statements)
“…After initial sample points are chosen and the function values at those points are obtained, a surrogate is created. Its convergence is tested with a quality criterion such as R² or Q³ [15]. Depending on the type of function/problem one faces, different strategies can be utilized to find additional sample points, should the convergence test fail.…”
Section: Restricted Optimization
confidence: 99%
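The sample-refine-retest loop described in this statement can be sketched in a few lines. This is a minimal illustration under assumptions of my own, not the cited paper's procedure: a cheap 1-D stand-in for the expensive function, SciPy's `RBFInterpolator` as the surrogate, an R² convergence test, and a "sample where the surrogate errs most" refinement rule.

```python
# Minimal sketch of an adaptive surrogate workflow (assumed details:
# 1-D test function, RBF surrogate, R^2 threshold, largest-error refinement).
import numpy as np
from scipy.interpolate import RBFInterpolator

def f(x):
    # Expensive function being modeled (here a cheap stand-in).
    return np.sin(3 * x) + 0.5 * x

x_train = np.linspace(0.0, 2.0, 5)[:, None]   # initial sample points
x_test = np.linspace(0.0, 2.0, 200)[:, None]  # validation grid
y_true = f(x_test).ravel()

for iteration in range(30):
    y_train = f(x_train).ravel()
    surrogate = RBFInterpolator(x_train, y_train)
    y_pred = surrogate(x_test)
    # R^2 quality test: 1 - SS_res / SS_tot.
    r2 = 1.0 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    if r2 > 0.999:  # convergence criterion (assumed threshold)
        break
    # Refinement strategy: add the point where the surrogate errs most.
    worst = x_test[np.argmax(np.abs(y_true - y_pred))]
    x_train = np.vstack([x_train, worst[None, :]])

print(f"converged after {iteration} refinements with R^2 = {r2:.4f}")
```

In practice the true function is not available on a validation grid, so the error-based refinement rule would be replaced by cross-validation or a model-uncertainty criterion; the loop structure stays the same.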
“…Depending on the type of function/problem one faces, different strategies can be utilized to find additional sample points, should the convergence test fail. This procedure is repeated until enough data has been collected for the surrogate model to satisfy the above-mentioned criteria, at which point convergence is reached [15].…”
Section: Restricted Optimization
confidence: 99%
“…The major drawback of this method is the incorrect variance associated with low-fidelity corrected data, which is treated as zero. Benchmarks of these methods can be found in [6,24]. With these methods, results from multiple solvers can be well exploited.…”
Section: Algorithm 1: Evofusion Algorithm
confidence: 99%
“…However, due to monetary and/or time costs, it is practically infeasible to experiment with or numerically simulate every feasible design point; thus, based on the results obtained, different techniques have been developed to predict the outcome at a specified point. Among the most widely used are surrogate models such as polynomial response surfaces [8], Kriging and gradient-enhanced Kriging (GEK) [9], radial basis functions [10], and support vector machines [11]. With the constructed approximation models, an optimization procedure is then used to find the optimal result. Optimization methods arise from optimization objectives, and they are becoming essential in every field of research.…”
Section: Introduction
confidence: 99%
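The gradient-enhanced idea behind methods such as GEK is that the surrogate is fitted to derivatives as well as function values, so each sample contributes more information. A toy version of this principle (my own illustration, not the cited paper's GEK formulation) fits a cubic polynomial to the values and slopes at just two points, i.e. Hermite interpolation:

```python
# Gradient-enhanced fit in miniature: match both f and f' at the samples.
# Toy example only -- a cubic polynomial fitted to sin(x) at two points.
import numpy as np

def f(x):
    return np.sin(x)

def df(x):
    return np.cos(x)

x_pts = np.array([0.0, 1.5])
# Basis [1, x, x^2, x^3]; first rows enforce p(x_i) = f(x_i),
# last rows enforce p'(x_i) = f'(x_i).
A_val = np.vander(x_pts, 4, increasing=True)
A_der = np.column_stack([np.zeros_like(x_pts), np.ones_like(x_pts),
                         2 * x_pts, 3 * x_pts ** 2])
A = np.vstack([A_val, A_der])
b = np.concatenate([f(x_pts), df(x_pts)])
coeffs = np.linalg.solve(A, b)  # 4 equations, 4 unknowns

# With only two samples, the gradient-enhanced fit already tracks sin(x):
x_check = np.linspace(0.0, 1.5, 50)
p = np.polyval(coeffs[::-1], x_check)  # polyval wants highest degree first
max_err = np.max(np.abs(p - f(x_check)))
print(f"max error on [0, 1.5]: {max_err:.4f}")
```

A value-only fit through two points would be a straight line; using the gradients doubles the interpolation conditions and turns the same two evaluations into a cubic, which is the efficiency argument made for gradient-enhanced metamodels throughout the reviewed article.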