2021
DOI: 10.48550/arxiv.2111.05040
Preprint
A survey on high-dimensional Gaussian process modeling with application to Bayesian optimization

Abstract: Extending the efficiency of Bayesian optimization (BO) to larger numbers of parameters has received much attention over the years. Even more so has Gaussian process regression modeling in such contexts, on which most BO methods are based. A variety of structural assumptions have been tested to tame high dimension, ranging from variable selection and additive decomposition to low-dimensional embeddings and beyond. Most of these approaches in turn require modifications of the acquisition function optimization…

Cited by 6 publications (7 citation statements)
References 77 publications (123 reference statements)
“…Regarding the optimization procedure, a natural evolution is the implementation of a multi-objective optimization which considers at the same time both mass and deflection at a given point, for example. This can be done in a Bayesian setting, accounting for high dimensional input parameter space, 68,69 but also other approaches should be considered, such as genetic algorithms enhanced by active subspaces. 70,71 …”
Section: Discussion
confidence: 99%
“…The limiting factor for scalability is the training of the GP emulator, which scales cubically with the number of training points as a result of having to perform matrix inversion (or pseudo-inversion) of a covariance matrix. A growing number of techniques exist that improve the scalability of GPs, for example, by identifying and removing variables with little or no impact on the output, or by defining new variables based on combinations of the original ones (e.g., Binois & Wycoff, 2021). Alternatively, other machine learning models such as deep neural networks can be used that scale better with input dimensions (e.g., Lan et al, 2021).…”
Section: Summary and Discussion
confidence: 99%
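The cubic bottleneck described in the excerpt above can be made concrete in a few lines of NumPy: training a GP reduces to factorizing the n × n covariance matrix, an O(n³) operation. A minimal sketch with a squared-exponential kernel (all function names and parameter values here are illustrative, not from the survey):

```python
import numpy as np

def sq_exp_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential covariance between two point sets (n,d) and (m,d).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-6):
    # Training cost is dominated by factorizing the n x n covariance
    # matrix K: the Cholesky step below is O(n^3), which is the
    # scalability bottleneck the citing paper refers to.
    K = sq_exp_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)                                   # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))   # O(n^2)
    K_s = sq_exp_kernel(X_test, X_train)
    return K_s @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X).ravel()
mean = gp_posterior_mean(X, y, X)   # near-interpolation at training points
```

Doubling n multiplies the Cholesky cost by roughly eight, which is why the techniques surveyed (variable selection, additive decomposition, embeddings) aim to shrink the effective problem before this step.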
“…We point the reader to [BW21] for a comprehensive overview of the state-of-the-art in high-dimensional BO.…”
Section: Related Work
confidence: 99%
“…The maximization task of the acquisition function is also hampered by high dimensionality. As such, BO is often taken only for small-scale problems (typically less than 20 search variables), and it remains an open challenge to scale it up for high-dimensional problems [BW21].…”
Section: Introduction
confidence: 99%
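The acquisition-side difficulty quoted above can also be sketched: a common inner loop scores a fixed budget of random candidates with expected improvement (EI), and in 20 dimensions that budget covers the search box only very sparsely. A hypothetical toy version (the surrogate posterior and all names are illustrative, not the paper's method):

```python
import math
import numpy as np

def expected_improvement(mu, sigma, best):
    # EI for minimization given posterior mean mu and sd sigma.
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))
    pdf = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * cdf + sigma * pdf

# Inner loop of BO: score a batch of random candidates, keep the best.
# In d = 20 dimensions a fixed candidate budget samples the unit box
# extremely sparsely -- one reason vanilla BO is usually limited to
# roughly 20 search variables, as the excerpt notes.
rng = np.random.default_rng(0)
d = 20
candidates = rng.uniform(0.0, 1.0, size=(1024, d))
mu = candidates.sum(axis=1)            # stand-in posterior mean (toy surrogate)
sigma = np.full(len(candidates), 0.5)  # stand-in posterior sd
ei = expected_improvement(mu, sigma, best=float(mu.min()))
x_next = candidates[np.argmax(ei)]     # next point to evaluate
```

Replacing this random search with gradient-based multi-start helps, but the multimodality of EI over a high-dimensional box remains, which is why most of the surveyed methods also restructure the acquisition optimization itself.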