AIAA Scitech 2020 Forum (2020)
DOI: 10.2514/6.2020-2127

Efficient Multi-Information Source Multiobjective Bayesian Optimization

Cited by 8 publications (3 citation statements) | References 40 publications
“…We adopt the methodology presented in refs. 17, 73 to conduct Bayesian optimization of multiobjective functions in multi-fidelity settings. For a detailed explanation of the calculation of the EHVI, we refer readers to ref.…”
Section: Multi-objective Optimization (mentioning)
confidence: 99%
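The EHVI (Expected Hypervolume Improvement) acquisition mentioned in this statement has exact formulas in the cited references; as a rough illustration only, below is a minimal Monte Carlo sketch of EHVI for a bi-objective minimization problem, assuming independent Gaussian posteriors per objective. The names `hypervolume_2d` and `mc_ehvi` are hypothetical helpers for this sketch, not functions from the cited papers.

```python
# Monte Carlo sketch of Expected Hypervolume Improvement (EHVI) for two
# objectives under minimization. Illustrative only; the exact EHVI
# computation in the cited work is more involved.
import numpy as np

def hypervolume_2d(front, ref):
    """Dominated hypervolume of a 2-D front (minimization) w.r.t. a reference point."""
    # Keep only points that dominate the reference point, sort by first objective.
    pts = np.asarray([p for p in front if p[0] < ref[0] and p[1] < ref[1]])
    if pts.size == 0:
        return 0.0
    pts = pts[np.argsort(pts[:, 0])]
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y < prev_y:                      # skip dominated points
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def mc_ehvi(mu, sigma, front, ref, n_samples=1000, seed=None):
    """Monte Carlo EHVI at one candidate, given the 2-D GP posterior mean `mu`
    and stddev `sigma` of the objectives (assumed independent here)."""
    rng = np.random.default_rng(seed)
    base = hypervolume_2d(front, ref)
    samples = rng.normal(mu, sigma, size=(n_samples, 2))
    gains = [hypervolume_2d(np.vstack([front, s]), ref) - base for s in samples]
    return float(np.mean(gains))

# Toy usage: current Pareto front, reference point, and one candidate's posterior.
front = np.array([[0.2, 0.8], [0.5, 0.5], [0.8, 0.2]])
print(mc_ehvi(mu=[0.4, 0.4], sigma=[0.1, 0.1], front=front, ref=[1.0, 1.0]))
```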
“…Contrary to other recent approaches, which propose to query the ground truth on a regular basis, such as (Khatamsaz et al., 2020), at each iteration FanG-HPO adaptively chooses among all the sources, including the ground truth. However, to ensure a sufficient quality of the approximation provided by the AGPs, before solving (12) FanG-HPO checks whether the number of augmenting observations coming from any one of the cheap sources is larger than the number from the ground truth: in that case s = 1 is selected, instead of solving (12).…”
Section: Deriving the Next Query (mentioning)
confidence: 99%
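The safeguard this statement describes reduces to a short selection rule. The sketch below is a hypothetical illustration of that rule, assuming the convention that source index 1 denotes the ground truth and that `solve_acquisition` stands in for problem (12) of the cited paper.

```python
# Sketch of the source-selection safeguard described above: if any cheap
# source has contributed more augmenting observations than the ground truth
# (source 1), the ground truth is queried directly instead of solving the
# acquisition problem over all sources.
def select_source(n_obs_per_source, solve_acquisition):
    """n_obs_per_source: dict mapping source index -> number of augmenting
    observations; source 1 is the ground truth by convention (assumed here)."""
    n_ground_truth = n_obs_per_source[1]
    cheap = {s: n for s, n in n_obs_per_source.items() if s != 1}
    if any(n > n_ground_truth for n in cheap.values()):
        return 1                      # force a ground-truth query (s = 1)
    return solve_acquisition()        # otherwise pick the source via problem (12)

# Toy usage: source 2 has out-contributed the ground truth, so s = 1 is forced.
print(select_source({1: 3, 2: 5, 3: 2}, solve_acquisition=lambda: 2))
```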
“…Thanks to its sample efficiency, BO is the core component of most current Automated Machine Learning (AutoML) solutions (Hutter et al., 2019; He et al., 2021), both open-source and commercial. BO has recently been extended to deal with multiple objectives (Hernández-Lobato et al., 2016; Paria et al., 2020), as well as with settings in which multiple information sources with different computational costs can be accessed (Ghoreishi & Allaire, 2019; Belakaria et al., 2020a; Candelieri et al., 2021; Candelieri & Archetti, 2021b; Khatamsaz et al., 2020). A special case arises when the sources can be organized hierarchically by their quality of approximation (aka fidelity), leading to so-called multi-fidelity optimization, originally proposed in (Kennedy & O'Hagan, 2000).…”
Section: Introduction, 1. Rationale and Motivations (mentioning)
confidence: 99%