2016
DOI: 10.1007/s10898-016-0427-3
A Bayesian approach to constrained single- and multi-objective optimization

Abstract: This article addresses the problem of derivative-free (single- or multi-objective) optimization subject to multiple inequality constraints. Both the objective and constraint functions are assumed to be smooth, non-linear and expensive to evaluate. As a consequence, the number of evaluations that can be used to carry out the optimization is very limited, as in complex industrial design optimization problems. The method we propose to overcome this difficulty has its roots in both the Bayesian and the multi-object…



Cited by 122 publications (74 citation statements)
References 62 publications
“…We next show the hypervolume indicator (HVI) [12] for the whole set of the Spatial benchmarks as a function of the initial number of warm-up samples (for the sake of space we omit the smallest benchmark, BlackScholes). For every benchmark, we show 5 repetitions of the experiments and report variability via a line plot with an 80% confidence interval.…”
Section: E. Hypervolume Indicator
confidence: 99%
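The hypervolume indicator referenced in the excerpt above measures the objective-space volume dominated by a set of solutions relative to a reference point; the improvement it assigns to a new point is a common acquisition quantity in multi-objective Bayesian optimization. A minimal sketch for the two-objective minimization case (the reference point and test values are illustrative, not taken from the cited benchmarks):

```python
def hypervolume_2d(front, ref):
    """Hypervolume (minimization) dominated by a set of 2D points,
    measured against a reference point `ref`."""
    # Keep only points that strictly dominate the reference point and
    # sort by the first objective; the dominated region then decomposes
    # into disjoint axis-aligned boxes.
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:  # skip points dominated within the set
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def hvi(front, candidate, ref):
    """Hypervolume improvement contributed by adding `candidate`."""
    return hypervolume_2d(list(front) + [candidate], ref) - hypervolume_2d(front, ref)
```

For example, adding the point (1.5, 1.5) to the front {(1, 3), (2, 2), (3, 1)} with reference point (4, 4) yields a hypervolume improvement of 1.25.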
“…[18,19,20,21,22] This includes utilities based on feasible improvement,[16,23,24] the penalty method,[25] the Lagrangian formulation,[26] or information gain.[27,28,29] For instance, the expected improvement with constraint (EIC)[16] is a natural extension of the expected improvement (EI) utility.…”
Section: B. Constrained Bayesian Optimization
confidence: 99%
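The EIC utility mentioned in this excerpt is typically computed in closed form under Gaussian process surrogates: the expected improvement of the objective multiplied by the probability that every constraint is satisfied. A minimal sketch assuming independent Gaussian predictive distributions at the candidate point (the function names and signatures here are illustrative, not from the cited papers):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for a Gaussian predictive distribution N(mu, sigma^2)
    of the objective, for minimization: EI = E[max(f_best - F, 0)]."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

def prob_feasible(mu_c, sigma_c):
    """P(c_j(x) <= 0 for all j), each constraint modeled as an
    independent Gaussian N(mu_c[j], sigma_c[j]^2)."""
    p = 1.0
    for m, s in zip(mu_c, sigma_c):
        p *= 0.5 * (1.0 + math.erf((0.0 - m) / (s * math.sqrt(2.0))))
    return p

def eic(mu, sigma, f_best, mu_c, sigma_c):
    """Expected improvement with constraints: EI(x) * PoF(x)."""
    return expected_improvement(mu, sigma, f_best) * prob_feasible(mu_c, sigma_c)
```

The product structure means candidates are rewarded both for likely improving on the best feasible value so far and for likely satisfying all constraints.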
“…In addition, ECMO [42] uses a Probability of Feasibility (PoF) criterion to handle constraints. Another recent example is the Bayesian Multi-Objective Optimization (BMOO) algorithm of Féliot et al. [43], which uses a Bayesian expected hypervolume improvement sampling criterion. Hence, the proposed MOCS-RS framework, along with promising numerical results, provides a significant addition to the class of algorithms for constrained multi-objective black-box optimization.…”
Section: Literature Review
confidence: 99%
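The expected hypervolume improvement criterion attributed to BMOO in the excerpt above can be estimated by Monte Carlo: sample the candidate's objective values from the surrogate's predictive distribution and average the resulting hypervolume improvement over the current front. The sketch below assumes independent Gaussian marginals for two objectives, which is a simplification of the actual BMOO criterion (that formulation also integrates over constraint outputs); all names and values are illustrative:

```python
import numpy as np

def hv2d(pts, ref):
    """2D hypervolume (minimization) of a point set w.r.t. `ref`."""
    pts = sorted(p for p in pts if p[0] < ref[0] and p[1] < ref[1])
    hv, prev = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev:  # skip points dominated within the set
            hv += (ref[0] - f1) * (prev - f2)
            prev = f2
    return hv

def mc_ehvi(front, mu, sigma, ref, n_samples=10000, rng=None):
    """Monte Carlo estimate of the expected hypervolume improvement of a
    candidate whose two objectives are modeled as independent Gaussians
    N(mu[i], sigma[i]^2), e.g. a GP predictive distribution."""
    rng = np.random.default_rng(0) if rng is None else rng
    base = hv2d(front, ref)
    samples = rng.normal(mu, sigma, size=(n_samples, 2))
    gains = [hv2d(list(front) + [tuple(s)], ref) - base for s in samples]
    return float(np.mean(gains))
```

As the predictive uncertainty shrinks, the estimate converges to the deterministic hypervolume improvement of the predictive mean; in practice, exact integration or more sample-efficient estimators are used instead of this plain Monte Carlo loop.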