2014
DOI: 10.1007/978-3-662-45523-4_49

Noisy Optimization: Convergence with a Fixed Number of Resamplings

Abstract: It is known that evolution strategies in continuous domains might not converge in the presence of noise [3,14]. It is also known that, under mild assumptions, and using an increasing number of resamplings, one can mitigate the effect of additive noise [4] and recover convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we get fast rates (log-linear convergence) provided that the variance decreases around the optimum …
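The mechanism behind resampling is elementary but worth seeing concretely: averaging r independent noisy evaluations divides the noise variance by r. Below is a minimal numerical check of this; the sphere objective, the additive-noise model, and the names `noisy_f` / `averaged_f` are our illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, noise_std=1.0):
    """Sphere objective plus additive Gaussian noise of std-dev noise_std."""
    return float(np.dot(x, x)) + noise_std * rng.standard_normal()

def averaged_f(x, r, noise_std=1.0):
    """Fitness averaged over r resamplings; noise variance becomes noise_std^2 / r."""
    return float(np.mean([noisy_f(x, noise_std) for _ in range(r)]))

x = np.ones(2)
for r in (1, 10, 100):
    samples = [averaged_f(x, r) for _ in range(2000)]
    print(f"r={r:4d}  empirical variance {np.var(samples):.4f}  (theory: {1.0/r:.4f})")
```

Since a constant r leaves a residual variance of noise_std^2 / r that does not vanish, constant resampling alone cannot defeat additive noise of fixed variance; the abstract's extra condition, that the noise variance itself decreases around the optimum, is what restores convergence.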

Cited by 3 publications (2 citation statements)
References 16 publications
“…Such results can be found in [4,5,21,27]. In some cases, the same behavior can be achieved in the noisy case; typically in the case of variance decreasing faster than in the multiplicative model, and if fitness values are averaged over a constant ad-hoc number of resamplings [9].…”
Section: Typical Convergence Behaviour (citation type: mentioning)
confidence: 55%
“…, f(x, w_r)) and the fitness value used in the comparisons is the average of these evaluations, $\frac{1}{r}\sum_{i=1}^{r} f(x, w_i)$. In particular, the variance of the noise is divided by r. Several rules have been studied: constant [13], adaptive (polynomial in the inverse of the step-size), polynomial and exponential [5] number of resamplings. A general (µ, λ)-ES is presented in Algorithm 2.…”
Section: Evolution Strategies (ES) (citation type: mentioning)
confidence: 99%
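Below is a minimal sketch of the scheme that excerpt describes: a (µ, λ)-ES in which each offspring's fitness is the average of a constant number r of noisy evaluations before selection. The sphere objective, the noise whose standard deviation scales with the fitness (so that variance decreases around the optimum), and the log-normal step-size self-adaptation rule are all our illustrative assumptions; this is not the exact Algorithm 2 referenced in the quote.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_f(x):
    """Sphere fitness with noise whose std-dev scales with f(x) itself,
    so the noise variance decreases around the optimum."""
    fx = float(np.dot(x, x))
    return fx + fx * rng.standard_normal()

def mu_lambda_es(dim=5, mu=5, lam=20, r=10, sigma=1.0, iters=200):
    """(mu, lambda)-ES using a constant number r of resamplings per offspring."""
    mean = np.ones(dim)
    tau = 1.0 / np.sqrt(2 * dim)  # self-adaptation learning rate
    for _ in range(iters):
        offspring = []
        for _ in range(lam):
            s = sigma * np.exp(tau * rng.standard_normal())   # mutate step-size
            x = mean + s * rng.standard_normal(dim)           # mutate search point
            fit = np.mean([noisy_f(x) for _ in range(r)])     # average r resamplings
            offspring.append((fit, x, s))
        offspring.sort(key=lambda t: t[0])                    # minimisation
        best = offspring[:mu]
        mean = np.mean([x for _, x, _ in best], axis=0)       # recombine the mu best
        sigma = float(np.exp(np.mean([np.log(s) for _, _, s in best])))
    return mean

print("distance to optimum:", np.linalg.norm(mu_lambda_es()))
```

The only noise-specific ingredient is the single line averaging r evaluations before selection; everything else is a standard (µ, λ)-ES, which is what makes the constant-resampling rule attractive in practice.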