2023
DOI: 10.21203/rs.3.rs-3118480/v1
Preprint
Decreased Max-value Entropy Search for Multi-fidelity Bayesian Optimization

Abstract: Bayesian Optimization is a widely applied, efficient framework that updates a surrogate model sequentially. To improve efficiency, multi-fidelity Bayesian Optimization has been developed to combine information from samples at different fidelity levels. However, the multiple fidelity levels bring a challenge for sequential sampling: in multi-fidelity Bayesian Optimization, the sampling strategy must determine not only the sample location but also the sample fidelity level for updating the model. To ba…
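The abstract's central point, that a multi-fidelity sampling strategy must jointly pick a location and a fidelity level, can be illustrated with a minimal, hypothetical selection loop. The `utility` and `cost` functions below are placeholders for illustration only, not the acquisition function proposed in this preprint:

```python
import numpy as np

def select_sample(candidates, fidelities, utility, cost):
    """Pick the (location, fidelity) pair maximizing cost-weighted utility.

    utility(x, m) scores the expected benefit of sampling location x at
    fidelity level m; cost(m) is the evaluation cost of fidelity level m.
    """
    best_pair, best_score = None, -np.inf
    for m in fidelities:
        for x in candidates:
            score = utility(x, m) / cost(m)
            if score > best_score:
                best_pair, best_score = (x, m), score
    return best_pair

# Toy illustration: higher fidelity is more informative but costlier.
toy_utility = lambda x, m: m - (x - 0.5) ** 2
toy_cost = lambda m: 1.0 + m
x_next, m_next = select_sample([0.0, 0.5, 1.0], [1, 2], toy_utility, toy_cost)
```

Real MFBO methods replace the brute-force double loop with an acquisition function optimized over the joint (location, fidelity) space, but the structure of the decision is the same.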

Cited by 3 publications (4 citation statements) | References 20 publications
“…Our acquisition function can be viewed as a version of BAX [32], which aims to greedily gain information about a computed function property via running an algorithm on posterior samples; here, the associated algorithm is an emittance optimization procedure, and the associated function property (computed by the algorithm) is the emittance minimizer over the control variables, x* ∈ X. Our acquisition function is discussed in more detail in appendix D. Timing results for this algorithm are given in appendix E. We note that the multipoint query optimization setting we consider in this paper, and the information-based acquisition function that we develop for this setting, are distinct from the settings (and corresponding methods) of several recent developments in BO, including standard optimization with entropy search methods [43][44][45], multi-objective optimization [42,46], and multi-fidelity optimization [47,48]. In particular, multi-fidelity Bayesian optimization (MFBO) settings are related in that they also involve an auxiliary input space (i.e.…”
Section: Multipoint-BAX for Efficient Emittance Optimization
confidence: 99%
“…A large body of research identifies the complete Pareto front using entropy-based acquisition functions. For example, MESMO (Wang & Jegelka, 2017; Belakaria et al., 2019), MESMOC (Belakaria et al., 2020), and PESMO (Hernández-Lobato et al., 2016) determine the Pareto front by reducing posterior entropy. SMSego uses the maximum hypervolume improvement acquisition function to choose the next sample (Ponweiser et al., 2008).…”
Section: Related Work
confidence: 99%
“…This is used to score the utility of evaluating a candidate design x ∈ X based on the statistical model. Some popular acquisition functions in the SOBO literature include expected improvement (EI) (Emmerich & Klinkenberg, 2008), upper confidence bound (UCB) (Srinivas et al., 2012), predictive entropy search (PES) (Hernández-Lobato et al., 2014), and max-value entropy search (MES) (Wang & Jegelka, 2017).…”
Section: Background and Definitions
confidence: 99%
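To make the excerpt above concrete, here is a minimal sketch of one of the acquisition functions it names, expected improvement (EI) for maximization, assuming a Gaussian posterior with mean `mu` and standard deviation `sigma` at each candidate. This is the generic textbook form, not any one cited paper's implementation:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_y):
    """EI for maximization: E[max(f(x) - best_y, 0)] under a Gaussian
    posterior N(mu, sigma^2) at each candidate point."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    z = (mu - best_y) / np.maximum(sigma, 1e-12)  # guard zero variance
    return (mu - best_y) * norm.cdf(z) + sigma * norm.pdf(z)

# At a point whose posterior mean equals the incumbent best_y,
# EI reduces to sigma * pdf(0).
ei = expected_improvement(mu=[0.0], sigma=[1.0], best_y=0.0)
```

EI trades off exploitation (the first term, large when the mean exceeds the incumbent) against exploration (the second term, large when the posterior is uncertain).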
“…Furthermore, recent entropy-based acquisition functions have been proposed: the basic idea is determining the next test point based on the expected information gained about the location or the value of the optimum of a function; see [35] for details. As entropy search may involve nested Monte Carlo algorithms, the added complexity makes these options not popular in actual applications, given that the other above enumerated acquisition functions do yield good performance in most cases, so entropy-based acquisition functions will not be discussed further in this work.…”
Section: Gaussian Processes and Bayesian Optimization
confidence: 99%
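The excerpt's complexity concern is worth qualifying: max-value entropy search (the family this preprint builds on) admits a comparatively cheap Monte Carlo approximation once samples of the maximum f* are in hand. The sketch below uses the standard closed-form per-sample term for a Gaussian posterior, as a generic illustration rather than the preprint's exact method:

```python
import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, fstar_samples):
    """Monte Carlo max-value entropy search: average information gain
    about max(f) from observing a point with Gaussian posterior
    N(mu, sigma^2), given sampled maximum values fstar_samples."""
    mu = float(mu)
    sigma = max(float(sigma), 1e-12)  # guard zero variance
    gamma = (np.asarray(fstar_samples, dtype=float) - mu) / sigma
    # Closed-form entropy reduction per sampled maximum.
    terms = (gamma * norm.pdf(gamma) / (2.0 * norm.cdf(gamma))
             - np.log(norm.cdf(gamma)))
    return float(np.mean(terms))

# A point whose posterior sits close to the sampled maxima is more
# informative than one far below them.
near = mes_acquisition(0.0, 1.0, [0.5, 0.7])
far = mes_acquisition(0.0, 1.0, [3.0, 3.2])
```

Each term needs only the normal pdf and cdf, so the inner computation is closed-form; the Monte Carlo element is confined to drawing the f* samples themselves.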