2020
DOI: 10.1007/s10994-020-05899-z

High-dimensional Bayesian optimization using low-dimensional feature spaces

Abstract: Bayesian optimization (BO) is a powerful approach for seeking the global optimum of expensive black-box functions and has proven successful for fine tuning hyper-parameters of machine learning models. However, BO is practically limited to optimizing 10–20 parameters. To scale BO to high dimensions, we usually make structural assumptions on the decomposition of the objective and/or exploit the intrinsic lower dimensionality of the problem, e.g. by using linear projections. We could achieve a higher compression …
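For context on the linear-projection approach the abstract mentions, below is a minimal Bayesian-optimization sketch that searches a random low-dimensional subspace and maps candidates back to the high-dimensional box, in the spirit of random-embedding methods such as REMBO. Everything here is an illustrative assumption rather than the paper's method: the toy objective, the fixed embedding matrix A, the unit-variance RBF GP, and the random-candidate expected-improvement step.

```python
import numpy as np
from math import erf, exp, pi, sqrt

rng = np.random.default_rng(0)

D, d = 100, 2                      # ambient and feature-space dimensionality
A = rng.normal(size=(D, d))        # fixed random linear embedding (illustrative)

def objective(x):
    """Toy expensive black box: only two of the 100 coordinates matter."""
    return -((x[3] - 0.2) ** 2 + (x[42] + 0.1) ** 2)

def lift(z):
    """Map a low-dimensional point back to the ambient box [-1, 1]^D."""
    return np.clip(A @ z, -1.0, 1.0)

def rbf(X1, X2, ls=0.5):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(Z, y, Zq, noise=1e-6):
    """Posterior mean/variance of a unit-variance RBF GP at query points Zq."""
    K = rbf(Z, Z) + noise * np.eye(len(Z))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(Zq, Z)
    v = np.linalg.solve(L, Ks.T)
    var = np.maximum(1.0 - (v ** 2).sum(axis=0), 1e-12)
    return Ks @ alpha, var

norm_cdf = np.vectorize(lambda u: 0.5 * (1.0 + erf(u / sqrt(2.0))))
norm_pdf = np.vectorize(lambda u: exp(-0.5 * u * u) / sqrt(2.0 * pi))

# Small initial design in the feature space, then BO with expected improvement.
Z = rng.uniform(-1.0, 1.0, size=(5, d))
y = np.array([objective(lift(z)) for z in Z])

for _ in range(25):
    Zq = rng.uniform(-1.0, 1.0, size=(500, d))   # random acquisition candidates
    mu, var = gp_posterior(Z, y, Zq)
    sigma = np.sqrt(var)
    u = (mu - y.max()) / sigma
    ei = (mu - y.max()) * norm_cdf(u) + sigma * norm_pdf(u)
    z_next = Zq[np.argmax(ei)]                   # maximize expected improvement
    Z = np.vstack([Z, z_next])
    y = np.append(y, objective(lift(z_next)))

print("best value found:", y.max())
```

Swapping the fixed random matrix A for a learned (possibly nonlinear) feature map is where the higher compression mentioned in the abstract would come from; the surrounding BO loop stays the same.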

Cited by 99 publications (43 citation statements). References 27 publications.
“…Furthermore, in the GPSG approach, additional or alternative level-0 emulators can be easily incorporated. Possible extensions to our approach include combination with methods that adaptively reduce the input space for constrained optimization problems [48], or other emulators may be chosen depending on the application. For example, homoscedastic GPs, which are faster than the heteroscedastic approach presented here, may be sufficient for many applications (but not for our IBM, in which a heteroscedastic approach was required due to the stochastic nature of the model).…”
Section: Discussion
Citation type: mentioning
Confidence: 99%
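The homoscedastic/heteroscedastic trade-off this statement describes comes down to one term in a GP emulator: the noise added to the kernel matrix is a single shared variance in the homoscedastic case and a per-observation variance in the heteroscedastic case. A minimal sketch, assuming the per-point noise levels are already known (real heteroscedastic emulators must learn them, e.g. from replicated runs of a stochastic simulator); the function names and constants are illustrative:

```python
import numpy as np

def gp_fit_predict(X, y, Xq, noise_var, ls=0.3):
    """RBF-kernel GP regression; `noise_var` is a scalar for a homoscedastic
    model or a per-observation vector for a heteroscedastic one."""
    def rbf(P, Q):
        d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls ** 2)
    K = rbf(X, X) + np.diag(np.broadcast_to(noise_var, (len(X),)))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, rbf(X, Xq))
    return rbf(Xq, X) @ alpha, np.maximum(1.0 - (v ** 2).sum(axis=0), 0.0)

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(40, 1))
sd = 0.05 + 0.5 * X[:, 0]                        # noise grows with the input,
y = np.sin(6.0 * X[:, 0]) + rng.normal(0.0, sd)  # as in a stochastic simulator

Xq = np.linspace(0.0, 1.0, 5)[:, None]
mu_hom, var_hom = gp_fit_predict(X, y, Xq, noise_var=float(np.mean(sd ** 2)))
mu_het, var_het = gp_fit_predict(X, y, Xq, noise_var=sd ** 2)
print(np.round(var_hom, 3), np.round(var_het, 3))
```

With one shared noise level the emulator over-estimates uncertainty in the quiet region and under-estimates it in the noisy one; letting the noise variance follow the input corrects this, at the extra fitting cost the quoted authors note (homoscedastic GPs are faster).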
“…Concretely, the performance of BO may suffer when the dimensionality of the data exceeds 10 to 20 dimensions [32]. Selection of appropriate ML techniques depends on the application, and future work may require that alternative techniques be adopted.…”
Section: Discussion
Citation type: mentioning
Confidence: 99%