Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII
DOI: 10.1145/2725494.2725500

Evolution Strategies with Additive Noise

Abstract: We consider the problem of optimizing functions corrupted with additive noise. It is known that Evolutionary Algorithms can reach a Simple Regret O(1/√n) within logarithmic factors, when n is the number of function evaluations. Here, Simple Regret at evaluation n is the difference between the evaluation of the function at the current recommendation point of the algorithm and at the real optimum. We show mathematically that this bound is tight, for any family of functions that includes sphere functions, at le…
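To illustrate the Simple Regret criterion described above (this is a minimal sketch, not the algorithm analysed in the paper), the following Python snippet runs a (1+1)-ES with averaging of resampled noisy evaluations on a sphere function corrupted by additive Gaussian noise; the dimension, budget, resampling count, and step-size constants are arbitrary choices made for the example.

import numpy as np

def noisy_sphere(x, rng):
    # Sphere function ||x||^2 corrupted by additive standard Gaussian noise.
    return float(np.dot(x, x)) + rng.normal(0.0, 1.0)

def one_plus_one_es(dim=5, budget=20000, resamples=10, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=dim)      # current point, also used as the recommendation
    sigma = 1.0                   # step size
    evals = 0
    history = []                  # (evaluations used, simple regret of the recommendation)
    while evals + 2 * resamples <= budget:
        y = x + sigma * rng.normal(size=dim)
        # Average resampled noisy evaluations to dampen the additive noise.
        fx = np.mean([noisy_sphere(x, rng) for _ in range(resamples)])
        fy = np.mean([noisy_sphere(y, rng) for _ in range(resamples)])
        evals += 2 * resamples
        if fy <= fx:
            x, sigma = y, sigma * 1.5   # success: accept the offspring, enlarge the step
        else:
            sigma *= 0.9                # failure: shrink the step
        # Simple regret: noiseless f at the recommendation minus f at the optimum (0 here).
        history.append((evals, float(np.dot(x, x))))
    return history

if __name__ == "__main__":
    for n, sr in one_plus_one_es()[::100]:
        print(f"evals={n:6d}   simple regret={sr:.5f}")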

Cited by 17 publications (15 citation statements). References 23 publications.
“…This rate s(SR) = −1 has been proved tight in [14]. Hence, the algorithms of Shamir and Fabian are faster than ESs, which cannot do better than s(SR) = −1/2, at least in their usual form [4].…”
Section: Stochastic Gradient Descent
confidence: 94%
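The slope notation s(SR) used in these passages is not defined on this page; a plausible reading, consistent with the rates quoted above and in the abstract, is the log-log asymptotic exponent of the simple regret:

% Assumed reading of the notation; not quoted from the paper.
\[
  s(\mathrm{SR}) \;=\; \lim_{n \to \infty} \frac{\log \mathrm{SR}_n}{\log n},
\]
% so that s(SR) = -1/2 corresponds to SR_n = O(1/\sqrt{n}) (the tight ES rate
% from the abstract), while s(SR) = -1 corresponds to the faster rate SR_n = O(1/n).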
“…The convergence rate is s(SR) = K for some K < 0, under assumptions about the convergence of the ES in the noise-free case. Moreover, [4] shows that ESs, under general conditions, must exhibit K > −1/2. There is no formal proof of an upper bound that can theoretically ensure a value or a range for s(SR).…”
Section: Simple Regret for ES with Resamplings
confidence: 95%
“…It is difficult to make ESs noise-resilient. The lower bound on the convergence rate in terms of the total number of function evaluations T has been shown to be Ω(1/√T) [2]. The lower bound relies on the property that ESs adapt the step-size to be O(∥x − x*∥/d).…”
Section: Introduction
confidence: 99%
“…The lower bound relies on the property that ESs adapt the step-size to be O(∥x − x*∥/d). It has been shown that ensuring a larger step-size allows for O(1/T) convergence on a quadratic function, the theoretical optimum [2,8].…”
Section: Introduction
confidence: 99%
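A rough intuition for why that step-size property pins ESs at the Ω(1/√T) rate (a heuristic sketch under stated assumptions, not the formal argument of [2]): assume additive noise of constant variance on a sphere-like function and step size σ = Θ(∥x − x*∥/d).

% Heuristic sketch; not the paper's proof.
% With sigma = Theta(||x - x*|| / d), the noise-free fitness gap between parent
% and offspring on the sphere scales like the simple regret itself:
\[
  |f(y) - f(x)| \;=\; O\!\left(\sigma \,\lVert x - x^{*}\rVert + \sigma^{2}\right)
  \;=\; O\!\left(\lVert x - x^{*}\rVert^{2}\right) \;=\; O(\mathrm{SR}),
\]
% while the additive noise keeps a constant standard deviation. Ranking offspring
% reliably by averaging then needs on the order of 1/SR^2 evaluations, so reaching
% regret SR requires T = Omega(1/SR^2) evaluations in total, i.e. SR = Omega(1/sqrt(T)).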