2015
DOI: 10.1007/s00453-015-0072-0
Robustness of Populations in Stochastic Environments

Cited by 65 publications (87 citation statements); references 18 publications.
“…In [8,9,11], the inefficiency of the (1+1) EA has been rigorously demonstrated for noisy and dynamic optimisation of OneMax and LeadingOnes. The reason behind this inefficiency is that in such environments it is difficult for the algorithm to compare the quality of solutions, e.g., it will often choose the wrong candidate.…”
Section: Optimisation Under Incomplete Information
confidence: 99%
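To make the failure mode described above concrete, here is a minimal Python sketch (ours, not from the cited papers) of a (1+1) EA on OneMax with prior one-bit noise: because the parent and offspring are compared on independently noisy evaluations, the algorithm can accept the worse of the two.

```python
import random

def onemax(x):
    """Number of one-bits; the true fitness."""
    return sum(x)

def noisy_onemax(x, p):
    """Prior one-bit noise: with probability p, flip a uniformly
    random bit before evaluating (the search point itself is unchanged)."""
    if random.random() < p:
        y = list(x)
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return onemax(y)
    return onemax(x)

def one_plus_one_ea(n, p, max_evals=100_000):
    """(1+1) EA with standard bit mutation (rate 1/n). Parent and
    offspring are each evaluated with fresh noise, so a comparison
    may go the wrong way. Returns True if the optimum is found."""
    x = [random.randint(0, 1) for _ in range(n)]
    for _ in range(max_evals):
        y = [1 - b if random.random() < 1 / n else b for b in x]
        if noisy_onemax(y, p) >= noisy_onemax(x, p):
            x = y
        if onemax(x) == n:
            return True
    return False
```

With p = 0 the comparison is exact and the elitist (1+1) EA reliably reaches the optimum; as p grows, wrongly accepted offspring erode progress, which is the effect the citing papers quantify.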
“…The reason behind this inefficiency is that in such environments it is difficult for the algorithm to compare the quality of solutions, e.g., it will often choose the wrong candidate. On the other hand, such effects could be reduced by having a population, for example the infinite-population model in [27] or the finite elitist populations in [11].…”
Section: Optimisation Under Incomplete Information
confidence: 99%
“…1. We have further shown that using the same algorithm we can also solve OneMax in the presence of noise of variance n. Gießen and Kötzing had previously shown in [5] that a (µ+1) EA was able to solve OneMax in the presence of noise of variance O(1). However, for a mutation-based search algorithm the variation in fitness of a child population will be dwarfed by our noise.…”
Section: Results
confidence: 61%
“…is bit-wise noise with parameter q. For LeadingOnes they improve results from [15], showing that the (1+1) EA runs in polynomial expected time if p = O((log n)/n^2) and that it runs in superpolynomial time if p = ω((log n)/n). This holds for one-bit noise with probability p, the (p, 1/n) model and bit-wise noise with probability p/n (see Table 1).…”
Section: Setting
confidence: 83%
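As a rough illustration of the prior-noise models named in the statement above (one-bit noise with probability p, and bit-wise noise with per-bit probability q, e.g. q = p/n), here is a small Python sketch; the function names are ours, chosen for illustration.

```python
import random

def one_bit_noise(x, p):
    """One-bit noise: with probability p, flip one uniformly
    random bit of a copy of x before evaluation."""
    y = list(x)
    if random.random() < p:
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
    return y

def bitwise_noise(x, q):
    """Bit-wise noise: flip each bit independently with
    probability q (e.g. q = p/n in the model cited above)."""
    return [1 - b if random.random() < q else b for b in x]
```

The two models differ in how much the perceived fitness can deviate: one-bit noise perturbs the value by at most 1, while bit-wise noise can flip several bits at once, though with q = p/n it flips one bit in expectation when the noise event occurs.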
“…Gießen and Kötzing [15] studied a more general class of algorithms, including the (1+1) EA, (1+λ) EA, and (µ+1) EA under prior noise and posterior noise, where posterior noise means that noise is added to the fitness value. They presented an elegant approach that gives results in both noise models.…”
Section: Introduction
confidence: 99%