Proceedings of the Genetic and Evolutionary Computation Conference 2018
DOI: 10.1145/3205455.3205563

A new analysis method for evolutionary optimization of dynamic and noisy objective functions

Cited by 34 publications (17 citation statements: 2 supporting, 15 mentioning, 0 contrasting; published 2019-2024).
References 13 publications.
“…The goal of OM is maximizing the number of 1s in a solution, while the goal of LO is maximizing the number of consecutive 1s counted from the first bit of a solution. Runtime analyses for the two problems under various noise models [15-20] showed that the (1+1)-EA can quickly find the optimum only if the noise level is low. For instance, one-bit noise is a frequently used noise model in theoretical analysis.…”
Section: Introduction (mentioning, confidence: 99%)
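
To make the quoted setting concrete, the following minimal Python sketch shows the two benchmark functions and the one-bit (prior) noise model described above; the function names are ours, chosen for illustration rather than taken from the cited papers.

    import random

    def onemax(x):
        # OM: number of 1-bits in the bit string x.
        return sum(x)

    def leading_ones(x):
        # LO: number of consecutive 1-bits counted from the first position.
        count = 0
        for bit in x:
            if bit != 1:
                break
            count += 1
        return count

    def one_bit_noise(x, f, p):
        # One-bit (prior) noise: with probability p, f is evaluated on a
        # copy of x with one uniformly chosen bit flipped; otherwise f(x)
        # is returned exactly.
        if random.random() < p:
            y = list(x)
            i = random.randrange(len(y))
            y[i] = 1 - y[i]
            return f(y)
        return f(x)
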
“…For OM of size n under one-bit noise, the expected runtime (ERT) of the (1+1)-EA is super-polynomial if p = ω(log n/n). There are also some studies concerning the effectiveness of various strategies to tackle noise, e.g., threshold selection [19,21,22], populations [16,18,20,23,24], and sampling [25-27]. For instance, if µ = Θ(log n), the ERT of the (µ+1)-EA optimizing OM under one-bit noise is polynomial for any p ∈ [0, 1] (note that p denotes the noise probability).…”
Section: Introduction (mentioning, confidence: 99%)
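
As an illustration of threshold selection, here is a hedged sketch of a (1+1)-EA that accepts an offspring only when its noisy fitness beats the parent's by at least a threshold tau; it reuses onemax and one_bit_noise from the sketch above, and the loop bound and termination check are our assumptions rather than the analyzed algorithm's exact formulation.

    def one_plus_one_ea_threshold(n, p, tau, max_iters=100_000):
        # (1+1)-EA with threshold selection on OneMax under one-bit noise.
        # Parent and offspring are both re-evaluated with fresh noise in
        # every comparison; the offspring wins only on an advantage >= tau.
        x = [random.randint(0, 1) for _ in range(n)]
        for _ in range(max_iters):
            # Standard bit mutation: flip each bit independently w.p. 1/n.
            y = [1 - b if random.random() < 1.0 / n else b for b in x]
            if one_bit_noise(y, onemax, p) >= one_bit_noise(x, onemax, p) + tau:
                x = y
            if onemax(x) == n:   # true fitness used for termination only
                break
        return x
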
“…The studies show that the (1+1)-EA is efficient, i.e., it can find an optimal solution in polynomial running time, only under low noise levels. Later studies mainly proved the robustness of different strategies against noise, including using populations [3,9,10,11,12], sampling [4,13], and threshold selection [14]. There is also a sequence of papers analyzing the running time of the compact genetic algorithm [15] and a simple ant colony optimization algorithm [16,17,18,19] on noisy problems, including OneMax as well as the combinatorial problem of single-destination shortest paths.…”
Section: Introduction (mentioning, confidence: 99%)
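
Sampling, recurring in these excerpts, averages several independent noisy evaluations to reduce the variance of the fitness estimate; a minimal sketch, again reusing one_bit_noise from above, with an illustrative sample size m:

    def sampled_fitness(x, f, p, m):
        # Sampling strategy: average m independent noisy evaluations.
        # The variance of the estimate shrinks by a factor of m.
        return sum(one_bit_noise(x, f, p) for _ in range(m)) / m

The trade-off is m times as many evaluations per comparison in exchange for more reliable selection decisions.
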
“…The presence of noise further increases the randomness of optimization, and only a few results on noisy evolutionary optimization have been reported. The (1+1)-EA algorithm, which uses population size 1 and mutation only, was first studied on the OneMax and LeadingOnes problems under various noise models [6,10,14,19,27,30]. OneMax and LeadingOnes are two benchmark pseudo-Boolean problems, widely used in theoretical analyses of EAs, whose goals are to maximize the number of 1-bits of a solution and the number of consecutive 1-bits counting from the left of a solution, respectively.…”
Section: Introduction (mentioning, confidence: 99%)
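
The plain (1+1)-EA referred to here keeps a single parent, produces one offspring by standard bit mutation, and retains the better of the two under noisy evaluation; a minimal sketch under the same one-bit noise model as above (the loop bound and return convention are illustrative):

    def one_plus_one_ea(n, p, f=onemax, max_iters=100_000):
        # Plain (1+1)-EA: population size 1, mutation only. Parent and
        # offspring are both evaluated with fresh noise in each comparison.
        x = [random.randint(0, 1) for _ in range(n)]
        for it in range(max_iters):
            y = [1 - b if random.random() < 1.0 / n else b for b in x]
            if one_bit_noise(y, f, p) >= one_bit_noise(x, f, p):
                x = y
            if f(x) == n:    # all-ones optimum for both OM and LO
                return x, it
        return x, max_iters
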
“…Later studies mainly investigated the robustness of different strategies against noise, including using populations [9,10,19,24,30], sampling [25,26,28] and threshold selection [27]. For example, the (µ + 1)-EA with µ = Θ(log n) can solve OneMax in polynomial time even if the probability of one-bit noise reaches 1.…”
Section: Introduction (mentioning, confidence: 99%)
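
For the population-based result quoted above, a (µ+1)-EA keeps µ parents, adds one mutated offspring per step, and discards the individual that appears worst under noisy evaluation; the sketch below assumes uniform parent selection and a fresh noisy evaluation of every individual in each selection step, which is an illustrative choice rather than the exact scheme analyzed in the cited work.

    def mu_plus_one_ea(n, mu, p, max_iters=200_000):
        # (mu+1)-EA on OneMax under one-bit noise. With mu = Theta(log n),
        # the quoted result states OneMax is solved in polynomial time for
        # any noise probability p in [0, 1].
        pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
        for _ in range(max_iters):
            parent = random.choice(pop)
            child = [1 - b if random.random() < 1.0 / n else b for b in parent]
            pop.append(child)
            # Rank by freshly sampled noisy fitness and drop the worst.
            pop.sort(key=lambda z: one_bit_noise(z, onemax, p), reverse=True)
            pop.pop()
            if any(onemax(z) == n for z in pop):
                break
        return pop
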