2019
DOI: 10.48550/arxiv.1909.03600
Preprint

Cost-aware Multi-objective Bayesian optimisation

Abstract: The notion of "expense" in Bayesian optimisation generally refers to a uniform evaluation cost across the whole search space. In some scenarios, however, the evaluation cost of a black-box objective function is non-uniform, since different inputs may incur different costs to evaluate. We introduce a cost-aware multi-objective Bayesian optimisation method with non-uniform evaluation cost over the objective functions, defined via cost-aware constraints over the search…
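Cost-aware acquisition is commonly illustrated by normalising a standard acquisition function by the evaluation cost, e.g. expected improvement per unit cost (EIpu), one of the baselines cited in the statements below. A minimal sketch, assuming a Gaussian posterior already computed on a grid of candidates; the posterior and the linear cost model here are illustrative stand-ins, not the method of this paper:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best):
    # Analytic EI for minimisation under a Gaussian posterior.
    z = (best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

# Hypothetical posterior over a 1-D grid of candidate inputs.
x = np.linspace(0.0, 1.0, 201)
mu = np.sin(3.0 * x)           # posterior mean  (stand-in)
sigma = 0.2 + 0.1 * x          # posterior std   (stand-in)
cost = 1.0 + 4.0 * x           # evaluation cost grows across the space
best = mu.min()                # best observed value so far (stand-in)

ei = expected_improvement(mu, sigma, best)
eipu = ei / cost               # cost-aware: EI per unit cost

# The cost-aware criterion never selects a more expensive point
# than plain EI would, since dividing by cost penalises expensive inputs.
print(x[np.argmax(ei)], x[np.argmax(eipu)])
```

Because `eipu` rescales EI by a cost that increases across the space, its maximiser is guaranteed to cost no more than the maximiser of plain EI.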

Cited by 6 publications (9 citation statements)
References 15 publications
“…We compared LaMBO with common baselines GP-UCB [14], GP-EI [15], Max-value entropy search [16], random sampling, and three cost-aware strategies: EIpu [17], CA-MOBO [19], and CArBO [18].…”
Section: Methods
confidence: 99%
“…To empirically evaluate the performance of the proposed method, we applied LaMBO to a number of synthetic datasets used in the literature. When compared with traditional Bayesian optimization (BO) baselines [14,15,16] and cost-aware variants [17,18,19], we find that our method outperforms these methods in terms of the trade-off between deviation from global optimum and the cumulative cost. We next apply LaMBO to a problem arising in neuroscience where the goal is to produce a 3D segmentation of brain structure from high-resolution imaging data [20,10].…”
Section: Introduction
confidence: 95%
“…Swersky et al [2013] introduce a cost-constrained, multi-task variant of entropy search to speed-up optimization of logistic regression and latent Dirichlet allocation. Cost information is input as a set of cost preferences (e.g., a parameter x 1 is more expensive than a parameter x 2 ) by Abdolshah et al [2019], who develop a multi-objective, constrained BO method that evaluates cheap points before expensive ones, as determined by the cost preferences, to find feasible, low-cost solutions. These methods outperform their black-box counterparts by evaluating cheap proxies or cheap points before selecting expensive evaluations, which is accomplished by leveraging additional cost information inside the optimization routine.…”
Section: Related Work
confidence: 99%
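The cheap-before-expensive ordering described in the statement above can be sketched with a toy cost model. The per-dimension weights and the movement-cost function below are illustrative assumptions, not the preference encoding of Abdolshah et al. [2019]:

```python
import numpy as np

# Hypothetical per-dimension cost weights encoding the preference
# "parameter x1 is more expensive than parameter x2".
cost_weights = np.array([5.0, 1.0])   # dim 0 = x1, dim 1 = x2

def evaluation_cost(x, x_prev):
    # Cost of moving from the previous query to x: expensive
    # dimensions contribute more per unit change (an assumed model).
    return float(cost_weights @ np.abs(x - x_prev))

rng = np.random.default_rng(0)
candidates = rng.uniform(size=(8, 2))  # random candidate inputs
x_prev = np.zeros(2)                   # previous query location

# Cheap-before-expensive: visit candidates in increasing cost order.
order = sorted(range(len(candidates)),
               key=lambda i: evaluation_cost(candidates[i], x_prev))
costs = [evaluation_cost(candidates[i], x_prev) for i in order]
```

Under this ordering, candidates that mostly perturb the cheap dimension `x2` are evaluated before those that perturb the expensive `x1`, which is the behaviour the citing authors attribute to preference-based cost-aware BO.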
“…However, we can extend this to a vector-valued constraint. For example, in materials design, there might be a finite amount of each constituent component, each with its own budget [Abdolshah et al, 2019].…”
Section: BO as a Constrained Markov Decision Process
confidence: 99%