2006 IEEE International Conference on Evolutionary Computation
DOI: 10.1109/cec.2006.1688677

Curse and Blessing of Uncertainty in Evolutionary Algorithm Using Approximation

Abstract: Evolutionary frameworks that employ approximation models, or surrogates, for solving optimization problems with computationally expensive fitness functions may be referred to as Surrogate-Assisted Evolutionary Algorithms (SAEA). In this paper, we present a study on the effects of uncertainty in the surrogate on SAEA. In particular, we focus on both the 'curse of uncertainty' and 'blessing of uncertainty' on evolutionary search, a notion borrowed from the 'curse and blessing of dimensionality' in [1]. Here, the 'curse o…
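The surrogate-assisted loop the abstract describes can be sketched minimally as follows. The sphere objective, the quadratic least-squares surrogate, and all parameter choices here are illustrative assumptions, not the paper's actual algorithm; the point is only the pattern of pre-screening offspring with a cheap, uncertain model so that few candidates receive a true, expensive evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_fitness(x):
    # Stand-in for a computationally expensive objective (sphere, minimised).
    return float(np.sum(x ** 2))

def features(X):
    # Quadratic basis for a cheap least-squares surrogate.
    return np.column_stack([X ** 2, X, np.ones(len(X))])

def fit_surrogate(X, y):
    w, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    return w

# Archive of points evaluated with the true (expensive) function.
X = rng.uniform(-5, 5, size=(20, 2))
y = np.array([expensive_fitness(x) for x in X])

for gen in range(10):
    w = fit_surrogate(X, y)
    # Mutate the current best point to produce candidate offspring.
    offspring = X[np.argmin(y)] + rng.normal(0.0, 0.5, size=(30, 2))
    # Pre-screen with the (uncertain) surrogate: only the 3 offspring with
    # the best predicted fitness receive a true, expensive evaluation.
    picks = offspring[np.argsort(features(offspring) @ w)[:3]]
    X = np.vstack([X, picks])
    y = np.concatenate([y, [expensive_fitness(p) for p in picks]])

print("best fitness found:", y.min())
```

With this setup only 3 of 30 offspring per generation are truly evaluated; whether the surrogate's error hurts or helps the search is exactly the curse/blessing question the paper studies.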

Cited by 6 publications (2 citation statements)
References 23 publications
“…However, the true best result after performing optimisation is obtained from the use of the hybrid Kriging-support vector regression (KG-SVR). This situation is called the blessing and curse of uncertainties [14]. Figure 5 shows a comparison between the design solutions from the conventional circular wire drawing, NCD, the optimum NCD using KG-SVR, and hybrids of polynomial response surface (PRS), radial basis function (RBF), KG and SVR.…”
mentioning
confidence: 99%
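The "hybrid" surrogates quoted above combine several base model types into one predictor. A minimal equal-weight hybrid of two base surrogates — an RBF interpolant and a polynomial response surface, on assumed toy data, not the cited paper's KG-SVR construction — looks like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy training data sampled from a noisy 1-D test function.
x = np.linspace(0.0, 1.0, 12)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.05, size=x.size)

def rbf_predict(xq, x, y, gamma=50.0):
    # Radial-basis-function interpolant: one base surrogate.
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
    w = np.linalg.solve(K + 1e-8 * np.eye(len(x)), y)
    return np.exp(-gamma * (xq[:, None] - x[None, :]) ** 2) @ w

def prs_predict(xq, x, y, deg=3):
    # Polynomial response surface (PRS): the other base surrogate.
    return np.polyval(np.polyfit(x, y, deg), xq)

def hybrid_predict(xq, x, y):
    # Equal-weight hybrid of the two base surrogates; real hybrids such as
    # KG-SVR typically use error-based weights rather than a plain average.
    return 0.5 * (rbf_predict(xq, x, y) + prs_predict(xq, x, y))

xq = np.linspace(0.0, 1.0, 101)
err = np.sqrt(np.mean((hybrid_predict(xq, x, y) - np.sin(2 * np.pi * xq)) ** 2))
print("hybrid RMSE vs. true function:", err)
```

Averaging tends to cancel the base models' uncorrelated errors, which is the intuition behind combining KG with SVR in the quoted work.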
“…In the first step of this analysis, the Pearson correlation coefficient is calculated… The results confirm the expectation that as the source-target similarities increase, there is a corresponding monotonic improvement in overall optimization performance. To further verify the assertion that this improvement is a consequence of improved modelling accuracy (due to knowledge transfer), rather than, for example, the blessings of uncertainty [151], the generalization error (RMSE) of the TS-GP model is compared with GP. Specifically, as shown in … Note that the GP subscript in the "Generalization Performance Gain" function refers to the no-transfer case.…”
Section: Investigating the Effects of Source-Target Similarity
mentioning
confidence: 99%