2022
DOI: 10.1613/jair.1.13643

HEBO: An Empirical Study of Assumptions in Bayesian Optimisation

Abstract: In this work we rigorously analyse assumptions inherent to black-box optimisation hyper-parameter tuning tasks. Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers. Based on these findings, we propose a Heteroscedastic and Evolutionary Bayesian Optimisation solver (HEBO). HEBO performs non-linear input and output warping, admits exact marginal log-likelihood optimisation and is robust to the values of learned parameters…
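The abstract names two concrete mechanisms: non-linear input/output warping around a GP surrogate whose hyper-parameters are fitted by exact marginal log-likelihood maximisation. The following Python sketch (assuming scikit-learn and NumPy; the Kumaraswamy warp parameters, the Matern-5/2 kernel, and the toy heteroscedastic objective are illustrative assumptions, not HEBO's actual configuration) shows how such warps can be wrapped around a standard GP surrogate:

# A minimal sketch, not the authors' implementation: a monotone input warp and a
# power-transform output warp around a GP fitted by marginal-likelihood maximisation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(0)

def kumaraswamy_warp(x, a=1.5, b=0.8):
    """Monotone warp of inputs in [0, 1]; a and b would normally be learned."""
    return 1.0 - (1.0 - np.clip(x, 1e-9, 1 - 1e-9) ** a) ** b

def objective(x):
    """Toy objective whose noise level grows with x (heteroscedastic)."""
    return np.sin(5 * x) + (0.05 + 0.3 * x) * rng.standard_normal(x.shape)

# Observed data on [0, 1].
X = rng.uniform(0.0, 1.0, size=(30, 1))
y = objective(X[:, 0])

# Output warping: a Yeo-Johnson power transform makes the targets closer to Gaussian.
out_warp = PowerTransformer(method="yeo-johnson")
y_warped = out_warp.fit_transform(y.reshape(-1, 1)).ravel()

# Input warping followed by a Matern-5/2 GP; scikit-learn fits the kernel
# hyper-parameters by maximising the exact log marginal likelihood.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3, normalize_y=True)
gp.fit(kumaraswamy_warp(X), y_warped)

# Predict in warped space, then map the mean back to the original output scale.
X_test = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
mu_warped = gp.predict(kumaraswamy_warp(X_test))
mu = out_warp.inverse_transform(mu_warped.reshape(-1, 1)).ravel()
print(np.round(mu, 3))

In the paper's solver the warp parameters are learned alongside the kernel hyper-parameters; they are hard-coded above only to keep the sketch short.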

Cited by 46 publications (32 citation statements). References 29 publications.
“…Moreover, the DL model hyper-parameters such as activation function, number of neurons, and number of layers can be modified to improve the performance. 53 Also, fine-tuning or transferring learning mechanisms can be explored to circumvent these limitations, which positively impacts model performance. It is also interesting to explore Gaussian processes 54 as an alternative model to extract Raman data from the CARS spectrum in future studies.…”
Section: Results
mentioning confidence: 99%
“…In one example, one might implement mortality/survival evaluation and cost prediction jointly in such a way that not only the financial element but also OSA patients' quality of life would be considered. Since abundant research has shown the power of Bayesian optimization for improved modeling, its use should be considered for informing prediction projects, although the method may prove time-consuming as the models grow larger and more complicated [78].…”
Section: Discussion
mentioning confidence: 99%
“…The two terms in the expression for the marginal likelihood represent the Occam factor 68 in their preference for selecting models of intermediate capacity. In practical applications, GPs have been primarily employed for their high quality uncertainty estimates across applications including materials modelling, 69 astronomical time series modelling, 70 machine learning hyperparameter tuning, 71,72 and Bayesian optimisation. 35,73…”
Section: Gaussian Processes
mentioning confidence: 99%
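For context, the "two terms" in the quoted passage are the data-fit and complexity-penalty terms of the standard GP log marginal likelihood (notation assumed here: targets \mathbf{y}, kernel matrix K, noise variance \sigma_n^2, n observations):

\log p(\mathbf{y} \mid X) = -\tfrac{1}{2}\,\mathbf{y}^{\top}\left(K + \sigma_n^{2} I\right)^{-1}\mathbf{y} \;-\; \tfrac{1}{2}\log\left|K + \sigma_n^{2} I\right| \;-\; \tfrac{n}{2}\log 2\pi

The first term rewards fitting the data, the second penalises model complexity (the Occam-factor trade-off the passage describes), and the last is constant in the hyper-parameters.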