Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence 2021
DOI: 10.24963/ijcai.2021/296

DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization

Abstract: Modern machine learning algorithms crucially rely on several design decisions to achieve strong performance, making the problem of Hyperparameter Optimization (HPO) more important than ever. Here, we combine the advantages of the popular bandit-based HPO method Hyperband (HB) and the evolutionary search approach of Differential Evolution (DE) to yield a new HPO method which we call DEHB. Comprehensive results on a very broad range of HPO problems, as well as a wide range of tabular benchmarks from neural architecture search…
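Since the abstract only names the two ingredients, the following is a minimal, self-contained sketch of the general idea: run Hyperband-style successive halving across increasing budgets, but generate each rung's candidates with differential evolution instead of random sampling. The `evaluate` objective, the population sizes, and the constants `F`, `CR`, and `eta` are illustrative assumptions, not the authors' implementation.

```python
import random

DIM = 3            # number of hyperparameters, each encoded in [0, 1]
F, CR = 0.5, 0.5   # DE scaling factor and crossover rate

def evaluate(config, budget):
    """Stand-in objective: pretend to train `config` for `budget` epochs.
    Replace with a real training/validation routine (lower is better)."""
    return sum((x - 0.3) ** 2 for x in config) / (1 + budget)

def de_offspring(pop):
    """One rand/1/bin differential evolution step over the population."""
    children = []
    for target in pop:
        a, b, c = random.sample(pop, 3)
        mutant = [min(1.0, max(0.0, a[i] + F * (b[i] - c[i])))
                  for i in range(DIM)]
        j_rand = random.randrange(DIM)  # guarantee one mutated coordinate
        child = [mutant[i] if (random.random() < CR or i == j_rand) else target[i]
                 for i in range(DIM)]
        children.append(child)
    return children

def dehb_like(min_budget=1, max_budget=27, eta=3, pop_size=27):
    budget = min_budget
    pop = [[random.random() for _ in range(DIM)] for _ in range(pop_size)]
    while budget <= max_budget:
        if len(pop) >= 4:                 # evolve the rung's candidates with DE
            pop = de_offspring(pop)
        scored = sorted(pop, key=lambda c: evaluate(c, budget))
        pop = scored[: max(1, len(scored) // eta)]  # successive halving
        budget *= eta
    return pop[0]

print(dehb_like())
```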

Cited by 26 publications (30 citation statements). References 15 publications.

“…As mentioned earlier, a major source of sub-optimality for the current gray-box techniques (Li et al., 2017; Falkner et al., 2018; Awad et al., 2021) is the poor rank correlation of performances at low and high budgets, which is endemic to the successive halving mechanism. In essence, prior methods mistakenly discard good hyper-parameter configurations by myopically relying only on the early performance after a few epochs, following a fixed fidelity scheme.…”
Section: Motivation
confidence: 99%
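To see why a fixed-fidelity successive-halving scheme can discard good configurations, consider this toy sketch: `score` is a hypothetical learning curve in which configurations with a long warm-up look weak at low budgets but dominate at the full budget, so the myopic early discard can eliminate the eventual winner. All names and constants here are illustrative.

```python
import random

def score(config, budget):
    """Toy learning curve: slow starters look bad early, win late."""
    warmup, asymptote = config
    return asymptote * (1 - warmup / (warmup + budget))

def successive_halving(configs, min_budget=1, max_budget=27, eta=3):
    budget = min_budget
    while budget < max_budget and len(configs) > 1:
        configs.sort(key=lambda c: score(c, budget), reverse=True)
        configs = configs[: max(1, len(configs) // eta)]  # myopic discard
        budget *= eta
    return configs[0]

random.seed(0)
# (warmup, asymptote): high-warmup/high-asymptote pairs are the
# "good but slow" configurations a fixed fidelity scheme tends to drop.
pool = [(random.uniform(0, 20), random.uniform(0, 1)) for _ in range(27)]
survivor = successive_halving(list(pool))
best_at_full = max(pool, key=lambda c: score(c, 27))
print("kept:", survivor, "vs. true best at full budget:", best_at_full)
```
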
“…BOHB (Falkner et al., 2018) uses TPE (Bergstra et al., 2011) and builds a surrogate model for every fidelity, adhering to a fixed-fidelity selection scheme. DEHB (Awad et al., 2021) samples candidates using differential evolution, which handles discrete and large hyperparameter search spaces better than BOHB.…”
Section: Related Work on Gray-box HPO
confidence: 99%
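As a rough illustration of how differential evolution can cope with discrete and large search spaces, the sketch below shows the common trick of evolving a continuous unit vector and decoding each dimension into a mixed search space. The `SPACE` definition and `decode` helper are hypothetical, not DEHB's API.

```python
import math

# Illustrative mixed search space: one entry per hyperparameter.
SPACE = {
    "lr":        ("log", 1e-5, 1e-1),                   # continuous, log scale
    "layers":    ("int", 1, 8),                         # discrete integer
    "optimizer": ("cat", ["sgd", "adam", "rmsprop"]),   # categorical
}

def decode(vec):
    """Map a unit vector (one coordinate in [0, 1] per hyperparameter)
    to a concrete configuration."""
    cfg = {}
    for x, (name, spec) in zip(vec, SPACE.items()):
        kind = spec[0]
        if kind == "log":                 # interpolate on a log scale
            lo, hi = spec[1], spec[2]
            cfg[name] = math.exp(math.log(lo) + x * (math.log(hi) - math.log(lo)))
        elif kind == "int":               # round to the nearest integer
            lo, hi = spec[1], spec[2]
            cfg[name] = lo + int(round(x * (hi - lo)))
        else:                             # categorical: index the choice list
            choices = spec[1]
            cfg[name] = choices[min(int(x * len(choices)), len(choices) - 1)]
    return cfg

print(decode([0.5, 0.9, 0.4]))  # e.g. {'lr': ~1e-3, 'layers': 7, 'optimizer': 'adam'}
```

Because mutation and crossover act only on the unit vectors, the same DE update applies unchanged whether a dimension is continuous, integer, or categorical.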