2015
DOI: 10.1080/0305215x.2015.1082350

Trust regions in Kriging-based optimization with expected improvement

Abstract: The Kriging-based Efficient Global Optimization (EGO) method works well on many expensive black-box optimization problems. However, it does not seem to perform well on problems with steep and narrow global minimum basins or on high-dimensional problems. This article develops a new Kriging-based optimization method called TRIKE (Trust Region Implementation in Kriging-based optimization with Expected improvement) that implements a trust-region-like approach where each iterate is obtained by maximizing an Expected…
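The abstract's core loop (maximize Expected Improvement, but only inside a trust region around the current iterate) can be sketched in a few lines. The following is a minimal illustration, not the paper's exact TRIKE rules: `predict` is an assumed surrogate interface returning the Kriging mean and standard deviation, and the global optimizer choice is arbitrary.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import differential_evolution

def expected_improvement(mu, sigma, f_best):
    """Standard EI for minimization, from the surrogate's mean/std."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def trike_like_iterate(predict, f_best, center, radius, bounds):
    """Maximize EI over the trust-region box intersected with the bounds.

    predict(x) -> (mu, sigma) is an assumed Kriging interface.
    """
    lo = np.maximum(bounds[:, 0], center - radius)
    hi = np.minimum(bounds[:, 1], center + radius)
    res = differential_evolution(
        lambda x: -expected_improvement(*predict(x), f_best),
        list(zip(lo, hi)), seed=0)
    return res.x, -res.fun                    # candidate iterate, its EI

# Toy usage with a stand-in surrogate (constant uncertainty):
bounds = np.array([[-2.0, 2.0], [-2.0, 2.0]])
predict = lambda x: (np.sum(x**2), 0.5)       # hypothetical (mu, sigma)
x_next, ei = trike_like_iterate(predict, f_best=1.0, center=np.zeros(2),
                                radius=0.5, bounds=bounds)
```

In a full implementation, the trust-region radius would then be adjusted according to how well the surrogate's promised improvement matched the true function value at `x_next`.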

Cited by 43 publications (19 citation statements)
References 51 publications
“…(i) Grid search: a grid of discrete couples (θ, λ) is generated, all the possible combinations of parameter values are evaluated, and the best combination (in terms of minimum ε_LOO) is retained. (ii) Numerical optimization: the minimization problem (18) is solved considering the hyperparameters as continuous variables and using the same algorithms for searching the Kriging hyperparameters…”
Section: International Journal of Aerospace Engineering
Mentioning, confidence: 99%
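The grid-search option in the excerpt above is straightforward to make concrete. Here is a runnable sketch under stated assumptions: θ is taken as a Gaussian kernel width, λ as a regularization (nugget) term, and ε_LOO is the closed-form leave-one-out RMS error of kernel ridge regression, standing in for the quoted paper's exact metric.

```python
import numpy as np
from itertools import product

def loo_error(theta, lam, X, y):
    """Closed-form leave-one-out RMS error of kernel ridge regression."""
    d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    K = np.exp(-d2 / (2.0 * theta**2))
    H = K @ np.linalg.inv(K + lam * np.eye(len(y)))   # hat matrix
    resid = (y - H @ y) / (1.0 - np.diag(H))          # LOO residuals
    return np.sqrt(np.mean(resid**2))

# Evaluate every (theta, lambda) couple on the grid; keep the best.
X = np.random.default_rng(0).uniform(size=(20, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1]**2
best = min(product(np.logspace(-1, 1, 10), np.logspace(-8, -2, 7)),
           key=lambda p: loo_error(p[0], p[1], X, y))
print("best (theta, lambda):", best)
```

The numerical-optimization option (ii) would instead hand `loo_error` to a continuous optimizer over the same two variables.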
“…SBO consists in replacing the high-fidelity model (or "truth" model, e.g., the CFD simulation) with a fast, lower-fidelity model which has preliminarily "learned" from high-fidelity data. Since the pioneering work by Jones et al. [1], several theoretical studies [2–18] have been published on the topic. The proposed methods differ in one of the following items: the employed surrogate model (e.g., model type and single or multiple models), the training approach (e.g., optimizing the prediction error, the cross-validation error, the generalized cross-validation error, and the likelihood function), the model updating strategy (e.g., usage of surrogate minimizers, infill criteria, and random criteria), and the optimization method adopted to find the model parameters and to explore the surrogate (e.g., heuristic, gradient-free or gradient-based, and global or local).…”
Section: Introduction
Mentioning, confidence: 99%
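The generic SBO loop this excerpt summarizes (fit a cheap surrogate to high-fidelity samples, pick an infill point, evaluate the truth model there, refit) can be shown compactly. The polynomial surrogate and minimizer-based infill below are illustrative stand-ins for the many cited variants, and `truth` is a toy substitute for an expensive simulation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def truth(x):                        # stand-in for an expensive simulation
    return (x - 0.3)**2 + 0.1 * np.sin(12 * x)

X = list(np.linspace(0.0, 1.0, 4))   # initial high-fidelity samples
Y = [truth(x) for x in X]
for _ in range(6):                   # SBO iterations
    surrogate = np.poly1d(np.polyfit(X, Y, deg=3))      # fit surrogate
    x_new = minimize_scalar(surrogate, bounds=(0.0, 1.0),
                            method="bounded").x         # infill point
    X.append(x_new); Y.append(truth(x_new))             # update data
print("best sample:", X[int(np.argmin(Y))], min(Y))
```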
“…It has been demonstrated that surrogate-based optimization displays superior performance for derivative-free optimization problems [41]. The most popular surrogate models implemented in optimization methods are RBF [42–46] and Kriging [47–51], owing to their capability to provide prediction uncertainty. Artificial neural networks (ANN) have excellent fitting characteristics with low complexity; therefore, implementations of ANN for surrogate-based optimization (SBO) are popular in various engineering applications [22, 52–54].…”
Section: A Brief Review of Surrogate-Based Optimization
Mentioning, confidence: 99%
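What makes Kriging (and RBF variants with an error estimate) attractive in the excerpt above is that a prediction comes paired with a pointwise uncertainty. A minimal sketch, assuming a Gaussian correlation with unit process variance and an illustrative nugget:

```python
import numpy as np

def kriging_predict(X, y, x_star, theta=0.5, nugget=1e-8):
    """Posterior mean and standard deviation of a simple Kriging model."""
    def k(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :])**2, axis=-1)
        return np.exp(-d2 / (2.0 * theta**2))
    K = k(X, X) + nugget * np.eye(len(y))
    k_s = k(X, x_star)
    mu = k_s.T @ np.linalg.solve(K, y)                    # posterior mean
    var = 1.0 - np.sum(k_s * np.linalg.solve(K, k_s), axis=0)
    return mu, np.sqrt(np.maximum(var, 0.0))

X = np.linspace(0.0, 1.0, 6)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
mu, sd = kriging_predict(X, y, np.array([[0.25], [0.9]]))
print(mu, sd)    # predictions with pointwise uncertainty
```

The uncertainty `sd` is exactly what infill criteria such as Expected Improvement consume alongside the mean.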
“…Sun et al. [22] further proposed a superlinearly convergent trust region-sequential quadratic programming (QP) approach. The method incorporates a combination algorithm that allows both the trust-region technique [23,24] and the sequential QP method to be used. It avoids solving the QP subproblem for nonlinearly constrained optimization problems, which gives the potential for fast convergence in the neighborhood of an optimal solution.…”
Section: Introduction
Mentioning, confidence: 99%
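The trust-region technique mentioned in this excerpt rests on one acceptance test: compare the reduction the model predicted with the reduction actually achieved, and resize the region accordingly. A minimal sketch follows; the thresholds and factors are common textbook choices, not the cited method's exact constants.

```python
def update_trust_region(f_old, f_new, m_old, m_new, radius,
                        shrink=0.5, grow=2.0, radius_max=1.0):
    """Return (accept_step, new_radius) from the reduction ratio rho."""
    predicted = m_old - m_new            # reduction promised by the model
    actual = f_old - f_new               # reduction actually achieved
    rho = actual / predicted if predicted > 0 else -1.0
    if rho < 0.25:                       # poor model: reject step, shrink
        return False, shrink * radius
    if rho > 0.75:                       # very good model: expand region
        return True, min(grow * radius, radius_max)
    return True, radius                  # adequate model: keep radius
```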