2017
DOI: 10.1016/j.asoc.2017.07.060

Self-adjusting parameter control for surrogate-assisted constrained optimization under limited budgets

Cited by 58 publications (49 citation statements)
References 28 publications
“…If the functions are non-smooth or noisy, it is likely that the GP surrogate degrades rapidly and overfits due to its interpolating behavior. A challenge for optimization under restricted budgets will be to find the right degree of approximation (smoothing factor) from a limited number of samples [ 102 ].…”
Section: Discussion
“…As Table 9 states, the proposed algorithm shows a significant improvement over GWO for θ = 0. In many studies on optimization, the strength of an optimization technique is measured by comparing the final solution achieved by different algorithms [101,102]. This approach only provides information about the quality of the results and neglects the speed of convergence which is a very important measure for expensive optimization problems.…”
Section: PLOS ONE
“…In many studies on optimization, the strength of an optimization technique is measured by comparing the final solution achieved by different algorithms. 93,94 This approach only provides information about the quality of the results and neglects the speed of convergence which is a very important measure for expensive optimization problems. Comparing the convergence curves (number of function evaluations) is also one of the common benchmarking approaches.…”
Section: Performance Measure
“…One of the major disadvantages of parameter tuning, compared to parameter control, is the lack of flexibility and relevance, as population changes dynamically during the search process while parameters remain unchanged, preventing the population from being updated accurately. Contrary to parameter tuning, parameter control is far from being thoroughly researched [15]. The methods of parameter control have generally been classified into three categories since 1999 [8], including deterministic, adaptive and self-adaptive methods.…”
Section: Related Work