Proceedings of the 15th ACM/SIGEVO Conference on Foundations of Genetic Algorithms 2019
DOI: 10.1145/3299904.3340304
An exponential lower bound for the runtime of the compact genetic algorithm on jump functions

Abstract: In the first runtime analysis of an estimation-of-distribution algorithm (EDA) on the multi-modal jump function class, Hasenöhrl and Sutton (GECCO 2018) proved that the runtime of the compact genetic algorithm with suitable parameter choice on jump functions with high probability is at most polynomial (in the dimension) if the jump size is at most logarithmic (in the dimension), and is at most exponential in the jump size if the jump size is super-logarithmic. The exponential runtime guarantee was achieved wit…


Cited by 20 publications (15 citation statements). References 42 publications.
“…While rigorous runtime analyses provide deep insights into the performance of randomised search heuristics, it is highly challenging even for simple algorithms on toy functions. Most current runtime results merely concern univariate EDAs on functions like OneMax [32,51,36,53,40], LeadingOnes [15,22,37,53,38], BinVal [52,37] and Jump [26,11,12], hoping that this provides valuable insights into the development of new techniques for analysing multivariate variants of EDAs and the behaviour of such algorithms on easy parts of more complex problem spaces [13]. There are two main reasons for this.…”
mentioning
confidence: 99%
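The benchmark functions named in the citation above have standard textbook definitions in the runtime-analysis literature. A minimal sketch, assuming bit strings are represented as Python lists of 0/1 values:

```python
# Standard pseudo-Boolean benchmark functions on bit strings x in {0,1}^n,
# as commonly used in runtime analyses of EDAs and evolutionary algorithms.

def one_max(x):
    # OneMax: the number of one-bits; maximised by the all-ones string.
    return sum(x)

def leading_ones(x):
    # LeadingOnes: length of the longest all-ones prefix of x.
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def bin_val(x):
    # BinVal: the binary value of x, reading x[0] as the most significant bit.
    n = len(x)
    return sum(bit << (n - 1 - i) for i, bit in enumerate(x))
```

For example, `one_max([1, 0, 1, 1])` is 3, `leading_ones([1, 1, 0, 1])` is 2, and `bin_val([1, 0, 1])` is 5.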
“…We refer to Lengler (2020, Section 2.4.3) for more details. Another approach to negative drift was used in Antipov et al (2019) and Doerr (2019b, 2020a). There the original process was transformed suitably (via an exponential function), but in a way that the drift of the new process is still negative or at most a small constant.…”
Section: Related Work
mentioning
confidence: 99%
“…While the basic approach is simple and natural, the non-trivial part is finding a rescaling function g which both gives at most a slow progress towards the target and gives a large difference g(b) − g(a). The rescalings used in [5] and [17] were both of exponential type, that is, g was roughly speaking an exponential function. By construction, they only led to lower bounds exponential in b − a, and in both cases the lower bound was not tight (apart from being exponential).…”
Section: Negative Drift
mentioning
confidence: 99%
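The exponential rescaling mentioned in the citation above can be sketched schematically (our notation, not necessarily the exact functions used in [5] or [17]): to bound the time a process X_t needs to travel from a to b, one applies a rescaling g and tracks the drift of g(X_t).

```latex
% Schematic exponential rescaling for a negative-drift lower bound
% (illustrative; the concrete g in [5] and [17] may differ).
g(x) = e^{c x}, \qquad c > 0,
\qquad\text{so that}\qquad
g(b) - g(a) = e^{ca}\bigl(e^{c(b-a)} - 1\bigr).
```

If the per-step drift of g(X_t) towards g(b) is at most a constant, the expected hitting time is at least of the order g(b) − g(a), which by the identity above is exponential in b − a but, by construction, not more than that; this is why such rescalings only yield lower bounds exponential in b − a.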
“…Asymptotically better runtimes can be achieved when using crossover, though this is not as easy as one might expect. Since these results and runtime analyses for even more distant algorithms are less relevant for this work, for reasons of space we direct the reader to the original works [4, 9-12, 19, 30, 34, 39, 46, 52, 56] or the more detailed overview in [18,Section 2.3].…”
Section: The Jump Function Class
mentioning
confidence: 99%
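The jump function class discussed in this section is usually defined as follows; a minimal Python sketch using the standard definition from the runtime-analysis literature, with bit strings as lists of 0/1 and jump size k:

```python
def jump(x, k):
    # Jump_k: equals k + OneMax(x) on the easy region (at most n - k ones)
    # and at the all-ones optimum, but drops to n - OneMax(x) inside the
    # gap, creating a fitness valley of width k - 1 just below the optimum.
    n = len(x)
    ones = sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones
```

For n = 10 and k = 3, the all-ones string scores 13, a string with 7 ones scores 10, and a string with 8 ones (inside the gap) scores only 2 — so elitist hill-climbers must jump over the valley, which is where crossover and EDAs can help.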