Exploration Enhanced Expected Improvement for Bayesian Optimization (2019)
DOI: 10.1007/978-3-030-10928-8_37

Cited by 36 publications (26 citation statements). References 8 publications.
“…EI's deteriorating performance at higher noise levels may indicate that the surrogate model is poorer at representing the noisy function, leading to insufficient exploration. This is in contrast to the more exploratory UCB, which over-explores in the noise-free case [3]. It would be interesting to investigate BO strategies for optimisation in the presence of noise.…”
Section: Results (citation type: mentioning; confidence: 99%)
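
The failure mode described above, a surrogate that misrepresents the noisy function, is easiest to see in code. Below is a minimal sketch (not taken from any of the cited papers; the test function, noise level, and kernel choice are illustrative assumptions) of fitting a Gaussian process surrogate to noisy observations with scikit-learn, where an explicit white-noise term lets the model separate signal from observation noise instead of interpolating the noise:

```python
# Minimal sketch: GP surrogate fit to noisy observations (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + x**2 - 0.7 * x        # hypothetical test function
X = rng.uniform(-1, 2, size=(10, 1))
y = f(X).ravel() + rng.normal(0, 0.3, size=10)       # noisy observations

# Matern kernel plus a white-noise term: the fitted noise level controls how
# much of the observed variation the surrogate attributes to noise vs signal.
kernel = Matern(nu=2.5) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mu, sigma = gp.predict(np.linspace(-1, 2, 200).reshape(-1, 1), return_std=True)
```

If the noise term is badly estimated, the posterior standard deviation sigma no longer reflects the true uncertainty, which is exactly what starves EI of exploration in the noisy setting discussed above.
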
“…We concentrate on CB (LCB in particular) because a simple amendment of it gives very satisfactory results (see Section IV). Berk et al [18] indicate that EI is prone to detecting a local rather than a global optimum. A common heuristic is that EI is not appropriate for noisy problems because when noise is present, the optimal observation may not correspond to the true optimal function value.…”
Section: The Acquisition Function (citation type: mentioning; confidence: 99%)
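
For reference, the two acquisition functions contrasted in these statements can be written in a few lines. The sketch below assumes a minimization convention and a plug-in incumbent y_best equal to the best observed value; both choices are illustrative assumptions, not taken from the quoted papers:

```python
# Minimal sketch of EI and LCB for minimization (illustrative conventions).
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best, xi=0.01):
    """EI under a Gaussian posterior; exploration enters only through sigma."""
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def lower_confidence_bound(mu, sigma, kappa=2.0):
    """LCB; the next point is the argmin, and kappa sets the exploration weight."""
    return mu - kappa * sigma
```

LCB makes its exploration weight explicit through kappa, which is why a "simple amendment" of it is straightforward, whereas EI's exploration is implicit and depends on how well the surrogate's predictive sigma reflects the true uncertainty.
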
“…In contrast, Sequential Model-Based Optimization (SMBO) (Jones et al 1998) takes advantage of the previous search trajectory. Several benchmarks (Hutter et al 2013; Bischl et al 2017; Berk et al 2018) demonstrate the superiority of MBO over grid and random search as well as evolutionary approaches. In the classical approach, Gaussian process regression, also called Kriging, is used as its regression model (Snoek et al 2012).…”
Section: Hyper-parameter Tuning Algorithms (citation type: mentioning; confidence: 99%)
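
As a rough illustration of the SMBO loop described in that statement, here is a minimal Gaussian-process-based sketch (a generic GP-plus-EI loop, not the pipeline of any cited benchmark; the objective function, candidate sampling, and budget are placeholder assumptions):

```python
# Minimal SMBO sketch: fit a GP surrogate to evaluated points, score random
# candidates with EI, evaluate the most promising one, and repeat.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                      # hypothetical expensive black box
    return (x - 0.3) ** 2 + 0.1 * np.sin(20 * x)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(4, 1))     # initial design
y = np.array([objective(x[0]) for x in X])

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(0, 1, size=(500, 1))
    mu, sigma = gp.predict(cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (y.min() - mu) / sigma
    ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)]       # most promising candidate under EI
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best configuration:", X[np.argmin(y)], "value:", y.min())
```

Unlike grid or random search, each new evaluation here is chosen using the entire search trajectory stored in (X, y), which is the property the quoted benchmarks credit for MBO's advantage.
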