Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII
DOI: 10.1145/2725494.2725502

(1+1) EA on Generalized Dynamic OneMax

Abstract: Evolutionary algorithms (EAs) perform well in settings involving uncertainty, including settings with stochastic or dynamic fitness functions. In this paper, we analyze the (1+1) EA on dynamically changing OneMax, as introduced by Droste (2003). We re-prove the known results on first hitting times using the modern tool of drift analysis. We extend these results to search spaces which allow for more than two values per dimension. Furthermore, we provide an anytime analysis as suggested by Jansen and Zarges (2014), …
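The setting in the abstract can be made concrete with a small sketch. The following is a minimal illustration, not the paper's exact model: it assumes a cyclic value range {0, …, r−1} per dimension, a target that occasionally moves by ±1 in a random dimension, and a ±1 resampling mutation applied to each position with probability 1/n. All names and parameters (fitness, one_plus_one_ea, p_change) are hypothetical.

import random

def fitness(x, target, r):
    # Negative sum of cyclic per-dimension distances to the moving target
    # (higher is better; 0 means the target is hit).
    return -sum(min(abs(xi - ti), r - abs(xi - ti)) for xi, ti in zip(x, target))

def one_plus_one_ea(n=20, r=5, steps=10_000, p_change=0.01, seed=0):
    rng = random.Random(seed)
    target = [rng.randrange(r) for _ in range(n)]
    x = [rng.randrange(r) for _ in range(n)]
    for _ in range(steps):
        # Dynamic change: with probability p_change, one target position
        # moves by +-1 (an assumed change model for illustration).
        if rng.random() < p_change:
            i = rng.randrange(n)
            target[i] = (target[i] + rng.choice((-1, 1))) % r
        # Mutation: each position moves to a +-1 neighbour with prob. 1/n,
        # generalizing standard bit mutation beyond the binary case.
        y = x[:]
        for i in range(n):
            if rng.random() < 1 / n:
                y[i] = (y[i] + rng.choice((-1, 1))) % r
        # Elitist (1+1) selection: keep the offspring if it is not worse.
        if fitness(y, target, r) >= fitness(x, target, r):
            x = y
    return x, target

With r=2 this reduces to the classical dynamic OneMax on bitstrings; larger r gives the generalized search spaces with more than two values per dimension that the paper analyzes.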

Cited by 51 publications (37 citation statements). References 24 publications.
“…The ability of a very simple EA without a population, the (1 + 1) EA, to track a target bitstring in a OneMax-like function is analysed in [3,18]. The analysis has recently been extended from bitstrings to larger alphabets [8]. The influence of magnitude and frequency of changes on the efficiency of the (1 + 1) EA in optimising a specifically designed function was investigated in [17], showing that some dynamic optimisation problems become easier with higher frequency of change.…”
Section: Introduction
confidence: 99%
“…The runtime analysis of dynamic evolutionary optimization is still in its very infancy, with only a few results available (e. g. [9,29,31,32,38,47,48]).…”
Section: How Diversity Benefits Dynamic Optimisation
confidence: 99%
“…To the best of our knowledge, Branke and Wang (2003) are the only ones to consider the scenario of change during a generation; they provide a detailed analysis of a simple (1, 2) evolution strategy, an algorithm that also performs only 2 function evaluations per generation, for this case. In other articles, among them work by Jansen and Schellbach (2005), Kötzing et al. (2015), Lissovoi and Witt (2013), Lissovoi and Witt (2014), Oliveto and Zarges (2013), and Oliveto and Zarges (2014), larger population sizes or offspring population sizes are taken into account, but it is (sometimes implicitly) assumed that the fitness function does not change within a generation. This assumption becomes critical when the effects of the choice of the population size and of the offspring population size are studied.…”
Section: State Of The Art In The Theoretical Analysis Of Dynamic Optimisation
confidence: 99%
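The mid-generation change scenario quoted above can be illustrated with a short sketch. This is an assumed model, not Branke and Wang's actual setup: the target may flip one bit before each of the two offspring evaluations, so the two offspring of a single (1, 2) ES generation can be judged under different fitness functions. All names and parameters are illustrative.

import random

def one_comma_two_es(n=20, generations=1000, p_change=0.05, seed=0):
    rng = random.Random(seed)
    target = [rng.randrange(2) for _ in range(n)]
    x = [rng.randrange(2) for _ in range(n)]

    def mutate(parent):
        # Standard bit mutation: flip each bit independently with prob. 1/n.
        return [b ^ int(rng.random() < 1 / n) for b in parent]

    def maybe_change():
        # Assumed dynamics: one target bit may flip before each evaluation,
        # so evaluations within the same generation can disagree.
        if rng.random() < p_change:
            target[rng.randrange(n)] ^= 1

    def f(y):
        # OneMax-like fitness: number of positions matching the target.
        return sum(yi == ti for yi, ti in zip(y, target))

    for _ in range(generations):
        y1, y2 = mutate(x), mutate(x)
        maybe_change()
        f1 = f(y1)  # first evaluation of the generation
        maybe_change()
        f2 = f(y2)  # second evaluation, possibly on a changed function
        x = y1 if f1 >= f2 else y2  # comma selection: parent always replaced
    return x, target

The point of the sketch is that f1 and f2 may be computed against different targets, which is exactly the within-generation change that the usual one-change-per-generation analyses assume away.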