2021
DOI: 10.1613/jair.1.12966
Output Space Entropy Search Framework for Multi-Objective Bayesian Optimization

Abstract: We consider the problem of black-box multi-objective optimization (MOO) using expensive function evaluations (also referred to as experiments), where the goal is to approximate the true Pareto set of solutions by minimizing the total resource cost of experiments. For example, in hardware design optimization, we need to find the designs that trade off performance, energy, and area overhead using expensive computational simulations. The key challenge is to select the sequence of experiments to uncover high-quali…
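The abstract's goal, approximating the Pareto set, can be made concrete with a minimal sketch: given objective values already observed for a batch of designs, keep only the non-dominated ones. This is an illustrative snippet, not the paper's method; the function name, the maximization convention, and the example objectives are assumptions.

```python
import numpy as np

def pareto_set(Y):
    """Indices of non-dominated rows of Y (all objectives to be maximized).

    Y: (n_points, n_objectives) array of observed objective values,
    e.g. (performance, -energy, -area) for the hardware-design example.
    """
    n = Y.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # i is dominated if some j is >= i on every objective and > i on one
        dominated = np.all(Y >= Y[i], axis=1) & np.any(Y > Y[i], axis=1)
        if dominated.any():
            keep[i] = False
    return np.flatnonzero(keep)
```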


Cited by 47 publications (71 citation statements) · References 32 publications
“…BBO with Multiple Objectives. Many multi-objective BBO algorithms have been proposed [4,5,25,29,37]. Couckuyt et.…”
Section: Background and Related Work (mentioning)
confidence: 99%
“…We use the classic EI [36] for single-objective optimization tasks. For multi-objective problems, we select EHVI [11] when the number of objectives is less than 5; we use the MESMO [4] algorithm for problems with a larger number of objectives, since EHVI's complexity increases exponentially with the number of objectives, which not only incurs a large computational overhead but also accumulates floating-point errors. We select the surrogate model in BO depending on the configuration space and the number of trials: if the input space has conditions (such as one parameter being required to be less than another), or there are over 50 parameters in the input space, or the number of trials exceeds 500, we choose the Probabilistic Random Forest proposed in [27] instead of a Gaussian Process (GP) as the surrogate, to avoid incompatibility or the high computational complexity of GP.…”
Section: Automatic Algorithm Selection (mentioning)
confidence: 99%
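The selection policy quoted above reduces to a simple dispatch rule. The sketch below paraphrases it; the function and return names are hypothetical, and only the thresholds (5 objectives, 50 parameters, 500 trials) and the component names (EI, EHVI, MESMO, probabilistic random forest, GP) come from the quoted text.

```python
def select_components(n_objectives, has_conditions, n_parameters, n_trials):
    """Dispatch rule paraphrased from the quoted passage (names illustrative)."""
    # Acquisition: EI for one objective, EHVI below 5 objectives,
    # MESMO beyond that (EHVI scales exponentially in the objective count).
    if n_objectives == 1:
        acquisition = "EI"
    elif n_objectives < 5:
        acquisition = "EHVI"
    else:
        acquisition = "MESMO"
    # Surrogate: probabilistic random forest for conditional spaces,
    # large spaces, or long runs; otherwise a Gaussian process.
    if has_conditions or n_parameters > 50 or n_trials > 500:
        surrogate = "ProbabilisticRandomForest"
    else:
        surrogate = "GaussianProcess"
    return acquisition, surrogate
```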
“…Pareto front entropy search (PFES) [52] is suitable when dealing with decoupled objectives. MESMO [2] builds on the max-value entropy-search criterion [58] and enjoys an asymptotic regret bound. Building on MESMO, two recent works have proposed MF-OSEMO [4] and iMOCA [3], two multi-fidelity, information-theoretic MBO techniques that internally use continuous-fidelity Gaussian processes.…”
Section: Related Work (mentioning)
confidence: 99%
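The quoted passage describes MESMO as building on the max-value entropy-search criterion. As a rough sketch of that idea (written from the standard MESMO formulation, not taken from any of the cited codebases), the acquisition at a candidate point sums a per-objective max-value entropy term over Pareto fronts sampled from the surrogate posteriors; the function name and input shapes below are assumptions.

```python
import numpy as np
from scipy.stats import norm

def mesmo_acquisition(mu, sigma, sampled_fronts):
    """MESMO-style output-space entropy acquisition at one candidate point.

    mu, sigma: posterior mean/std of the K objective GPs at x, shape (K,).
    sampled_fronts: list of S arrays of shape (n_points, K), Pareto fronts
    obtained by cheaply optimizing posterior samples of the objectives.
    """
    acq = 0.0
    for front in sampled_fronts:
        y_star = front.max(axis=0)  # per-objective max on the sampled front
        gamma = (y_star - mu) / np.maximum(sigma, 1e-12)
        cdf = np.clip(norm.cdf(gamma), 1e-12, None)
        # entropy reduction of a Gaussian truncated at y_star, per objective
        acq += np.sum(gamma * norm.pdf(gamma) / (2.0 * cdf) - np.log(cdf))
    return acq / len(sampled_fronts)
```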
“…In Table 1, we report the results of this experiment, comparing MO algorithms against the following algorithmic fairness methods: Zafar [60], Adversarial debiasing [63], Fair Empirical Risk Minimization (FERM and FERM pre-processed) [13], the constrained BO approach (CBO) for fair models [44], and SMOTE [8]. These results show that the MO approach is very competitive compared to model- and constraint-specific methods (FERM, Zafar, Adversarial), and does so without knowing a priori the threshold of our fairness constraint (as is instead needed for CBO).…”
Section: NAS-201 Benchmark Dataset (mentioning)
confidence: 99%