2020
DOI: 10.1162/evco_a_00258

Simple Hyper-Heuristics Control the Neighbourhood Size of Randomised Local Search Optimally for LeadingOnes

Abstract: Selection hyper-heuristics (HHs) are randomised search methodologies which choose and execute heuristics during the optimisation process from a set of low-level heuristics. A machine learning mechanism is generally used to decide which low-level heuristic should be applied in each decision step. In this paper we analyse whether sophisticated learning mechanisms are always necessary for HHs to perform well. To this end we consider the most simple HHs from the literature and rigorously analyse their performance …
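The abstract describes the general mechanism of a selection hyper-heuristic: at each decision step one of several low-level heuristics is chosen and applied to the current solution. As a rough illustration of the simplest such mechanism discussed in this literature (often called "Simple Random"), the following Python sketch picks a neighbourhood size uniformly at random in every iteration and applies the corresponding "flip k bits" operator of randomised local search to a LEADINGONES instance. This is our own minimal sketch, not the paper's exact algorithms; all names and parameter choices are illustrative.

```python
import random

def leading_ones(x):
    """LEADINGONES: the number of consecutive 1-bits at the start of the bit string."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def flip_k_bits(x, k):
    """Low-level heuristic of RLS_k: flip exactly k distinct, uniformly chosen positions."""
    y = list(x)
    for i in random.sample(range(len(x)), k):
        y[i] = 1 - y[i]
    return y

def simple_random_hyper_heuristic(n, neighbourhood_sizes=(1, 2), budget=100_000):
    """Learning-free selection hyper-heuristic: in every iteration a neighbourhood
    size k is drawn uniformly at random from the given set, and the offspring is
    accepted if it is at least as good as the parent."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    for _ in range(budget):
        if fx == n:                               # optimum found
            break
        k = random.choice(neighbourhood_sizes)    # uniform, memory-less choice
        y = flip_k_bits(x, k)
        fy = leading_ones(y)
        if fy >= fx:
            x, fx = y, fy
    return x, fx

if __name__ == "__main__":
    print(simple_random_hyper_heuristic(n=100)[1])
```

Whether such a memory-less choice of the neighbourhood size already suffices, or whether a learning mechanism is needed, is the question the abstract raises.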

Cited by 37 publications (19 citation statements). References 45 publications.

Citation statements (ordered by relevance):
“…All these references consider the optimization of ONEMAX, the problem of maximizing the counting-ones function $\mathrm{OM}\colon \{0,1\}^n \to \mathbb{R},\ x \mapsto \sum_{i=1}^{n} x_i$. Only a few theoretical results analyzing algorithms with adaptive parameters consider different functions, e.g., (Lissovoi et al., 2020; Doerr et al., 2018b,a) (see for a complete list of references). ONEMAX also plays a prominent role in empirical research on parameter control.…”
Section: Introduction (mentioning, confidence: 99%)
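For concreteness, the counting-ones function OM defined in the snippet above is trivial to express in code (the name one_max is our illustrative choice):

```python
def one_max(x):
    """ONEMAX: OM(x) = sum_{i=1}^{n} x_i, the number of 1-bits; maximised by the all-ones string."""
    return sum(x)

assert one_max([1, 0, 1, 1]) == 3
```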
“…, } for > 1 provably outperform static settings [11] of the mutation operator. Self-adjusting schemes are also closely related to hyper-heuristics which, e.g., can dynamically choose between different mutation operators and therefore outperform static settings [25]. Besides the mutation probability, other parameters like the population sizes may be adjusted during the run of an evolutionary algorithm (EA) and analyzed from a runtime perspective [23].…”
Section: Introduction (mentioning, confidence: 99%)
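The snippet above contrasts self-adjusting parameter settings with static ones. As a generic illustration of what "self-adjusting" can mean in this context (this is not the specific scheme cited as [11] or [25]; the update factors, clamping bounds and direction of adjustment are arbitrary choices of ours), a (1+1) EA can multiply its mutation probability up after an improving step and down otherwise:

```python
import random

def self_adjusting_one_plus_one_ea(f, n, max_iters=100_000, up=2.0, down=0.5):
    """Sketch of a (1+1) EA with a multiplicative self-adjusting mutation
    probability (illustrative only): p grows after an improving offspring and
    shrinks otherwise, clamped to [1/n^2, 1/2]."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    p = 1.0 / n                                   # the standard static choice as starting value
    for _ in range(max_iters):
        y = [bit ^ 1 if random.random() < p else bit for bit in x]
        fy = f(y)
        if fy >= fx:                              # elitist acceptance
            if fy > fx:
                p = min(p * up, 0.5)              # improvement: adjust p upwards
            x, fx = y, fy
        else:
            p = max(p * down, 1.0 / n ** 2)       # stagnation: adjust p downwards
    return x, fx, p

if __name__ == "__main__":
    one_max = lambda x: sum(x)
    print(self_adjusting_one_plus_one_ea(one_max, n=100)[1])
```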
“…The Generalised Random Gradient (GRG) hyper-heuristic was analysed (Lissovoi, Oliveto, and Warwicker 2019b) as an extension of the classical Random Gradient hyper-heuristic (Cowling, Kendall, and Soubeiga 2001). It applies a randomly chosen low-level heuristic for a learning period of τ iterations.…”
Section: Selection Hyper-heuristics (mentioning, confidence: 99%)
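The snippet above summarises the Generalised Random Gradient (GRG) hyper-heuristic: a randomly chosen low-level heuristic is applied for a learning period of τ iterations. The sketch below implements that selection loop as we read it from the snippet; the acceptance rule, the "keep the operator for another period after an improvement" detail, and all names are our illustrative assumptions, not the authors' pseudocode.

```python
import random

def generalised_random_gradient(f, x, heuristics, tau, max_iters=100_000):
    """GRG-style selection loop (illustrative sketch): a low-level heuristic
    chosen uniformly at random is run for a learning period of tau iterations;
    if it produces an improvement within the period it is kept for another
    period, otherwise a new heuristic is drawn at random."""
    fx = f(x)
    iters = 0
    while iters < max_iters:
        h = random.choice(heuristics)            # draw a low-level heuristic at random
        keep = True
        while keep and iters < max_iters:
            keep = False
            for _ in range(tau):                 # learning period of tau iterations
                iters += 1
                y = h(x)
                fy = f(y)
                if fy >= fx:                     # elitist acceptance of the offspring
                    if fy > fx:
                        keep = True              # improvement: reuse h, restart the period
                    x, fx = y, fy
                if keep or iters >= max_iters:
                    break
    return x, fx

def flip_k(k):
    """Low-level heuristic factory: flip k distinct, uniformly chosen bit positions."""
    def op(x):
        y = list(x)
        for i in random.sample(range(len(x)), k):
            y[i] = 1 - y[i]
        return y
    return op

if __name__ == "__main__":
    n = 50
    leading_ones = lambda x: next((i for i, b in enumerate(x) if b == 0), n)
    x0 = [random.randint(0, 1) for _ in range(n)]
    best, value = generalised_random_gradient(
        leading_ones, x0, heuristics=[flip_k(1), flip_k(2)], tau=n)
    print(value)
```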
“…multimodal benchmark functions (Alanazi and Lehre 2016; Lehre and Özcan 2013; Alanazi and Lehre 2014; Qian, Tang, and Zhou 2016; Lissovoi, Oliveto, and Warwicker 2019b; 2019a; Doerr et al. 2018; Doerr, Doerr, and Yang 2016a). For an overview of theoretical results regarding the performance of hyper-heuristics and other parameter control mechanisms, we refer to the recent survey by Doerr and Doerr (2019).…”
Section: Introduction (mentioning, confidence: 99%)