2014
DOI: 10.1007/978-3-319-10762-2_30

On the Effectiveness of Sampling for Evolutionary Optimization in Noisy Environments

Abstract: Sampling has often been employed by evolutionary algorithms to cope with noise when solving noisy real-world optimization problems. It can improve estimation accuracy by averaging over a number of samples, while also increasing the computation cost. Many studies have focused on designing efficient sampling methods, and conflicting empirical results have been reported. In this paper, we investigate the effectiveness of sampling in terms of rigorous running time, and find that sampling can be ineffective. We prov…
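The abstract describes sampling as averaging several noisy evaluations of the same solution, trading extra computation cost for a more accurate fitness estimate. A minimal sketch of that idea, assuming OneMax with additive Gaussian noise as the noisy problem; the names noisy_onemax and sampled_fitness and the parameters sigma and k are illustrative, not taken from the paper:

```python
import random

def noisy_onemax(x, sigma=1.0):
    """OneMax fitness corrupted by additive Gaussian noise (illustrative noise model)."""
    return sum(x) + random.gauss(0.0, sigma)

def sampled_fitness(x, k, sigma=1.0):
    """Estimate the true fitness by averaging k independent noisy evaluations.

    Averaging k samples shrinks the noise variance by a factor of k,
    but costs k fitness evaluations instead of one.
    """
    return sum(noisy_onemax(x, sigma) for _ in range(k)) / k

# Example: the averaged estimate is much closer to the true value 50.
x = [1] * 50
print(noisy_onemax(x, sigma=5.0), sampled_fitness(x, k=100, sigma=5.0))
```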

Cited by 26 publications (33 citation statements) · References 25 publications
“…Ensemble learning (EL) is the strategic generation and combination of multiple models, such as classifiers or experts, to solve particular computational intelligence problems [43]. EL is primarily used to improve the performance (e.g., classification, prediction, and function approximation) of a model, or reduce the likelihood of an unfortunate selection of a poor one.…”
Section: B. Ensemble Learning (mentioning)
confidence: 99%
“…We remark that Qian et al. [21] have investigated the impact of resampling in the (1+1)-EA on the expected first hitting time. They concluded in Theorem 2 of [21] that resampling is useless for the (1+1)-EA optimizing OneMax with Gaussian noise, i.e., the expected first hitting time increases as k increases.…”
Section: Runtime Analysis With Resampling - Gaussian Noise (mentioning)
confidence: 98%
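The excerpt above concerns the expected first hitting time of the (1+1)-EA with k resamples on OneMax under Gaussian noise. A self-contained sketch of that setting, purely to make the setup concrete; the acceptance rule, the parameter values, and the re-evaluation of the parent in every iteration are assumptions of this sketch, not a restatement of the algorithm analysed in [21]:

```python
import random

def one_plus_one_ea_resampling(n=50, k=5, sigma=1.0, budget=100_000):
    """(1+1)-EA on OneMax with additive Gaussian noise, averaging k resamples
    per evaluation. Returns the number of noisy evaluations spent before the
    true optimum (the all-ones string) is first hit, or None if the budget runs out."""
    def fitness(x):
        # average k independent noisy evaluations of OneMax
        return sum(sum(x) + random.gauss(0.0, sigma) for _ in range(k)) / k

    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while evals < budget:
        if sum(x) == n:                 # true optimum reached
            return evals
        # standard bit mutation: flip each bit independently with probability 1/n
        y = [1 - b if random.random() < 1.0 / n else b for b in x]
        evals += 2 * k                  # parent and offspring are both (re)evaluated
        if fitness(y) >= fitness(x):
            x = y
    return None

print(one_plus_one_ea_resampling(n=20, k=3, sigma=0.5))
```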
“…There are several variants for choosing k, such as taking a fixed k, incrementing k with the iteration counter, or adapting k during the optimization. The impact of resampling on the convergence rate has been investigated empirically or theoretically in [23,1,12,22,21]. Here we focus on adapting resampling schemes from continuous codomains [2] to discrete ones, and we cover a broad class of optimizers stated in the next subsection.…”
Section: State of the Art (mentioning)
confidence: 99%
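The excerpt lists three families of rules for choosing the number of resamples k. The tiny sketch below only names those families as sample-size schedules; the concrete adaptation rule is invented for illustration and is not the scheme analysed in [2] or [21]:

```python
def fixed_k(iteration, k=10):
    """Fixed resampling: the same number of samples in every iteration."""
    return k

def incremental_k(iteration):
    """Incremental resampling: grow the sample size with the iteration counter."""
    return iteration + 1

def adaptive_k(current_k, offspring_accepted):
    """Adaptive resampling (illustrative rule only): sample more after a
    rejection, back off after an acceptance."""
    return max(1, current_k // 2) if offspring_accepted else 2 * current_k
```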
“…Qian, Yu, and Zhou [27] showed that noise can be handled efficiently by combining reevaluation and threshold selection. Akimoto, Astete-Morales, and Teytaud [1] as well as Qian, Yu, Tang, Jin, Yao, and Zhou [26] showed that resampling can essentially eliminate the effect of noise.…”
Section: Introduction (mentioning)
confidence: 99%
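The first sentence of the excerpt refers to combining reevaluation with threshold selection. A minimal sketch of how such an acceptance rule can look, assuming a noisy OneMax fitness and a threshold parameter tau; the exact rule and noise model analysed in [27] may differ:

```python
import random

def noisy_fitness(x, sigma=1.0):
    # noisy OneMax, used only to make the acceptance rule concrete
    return sum(x) + random.gauss(0.0, sigma)

def accept_with_threshold(parent, offspring, tau, sigma=1.0):
    """Reevaluation + threshold selection: both solutions are freshly
    re-evaluated, and the offspring replaces the parent only if its noisy
    fitness exceeds the parent's by at least the threshold tau."""
    return noisy_fitness(offspring, sigma) >= noisy_fitness(parent, sigma) + tau
```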