2017
DOI: 10.1002/nme.5632

Probabilistic nonconvex constrained optimization with fixed number of function evaluations

Abstract: A methodology is proposed for the efficient solution of probabilistic nonconvex constrained optimization problems under uncertainty. Statistical properties of the underlying stochastic generator are characterized from an initial statistical sample of function evaluations. A diffusion manifold over the initial set of data points is first identified and an associated basis computed. The joint probability density function of this initial set is estimated using a kernel density model and an Itô stochastic dif…
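The density-estimation step the abstract describes — learning the joint probability density of a small initial sample so that additional realizations can be generated without further expensive evaluations — can be sketched as follows. This is a minimal illustration, assuming a toy 2-D Gaussian sample and using plain kernel-density resampling as a stand-in for the paper's Itô-sampler construction:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative assumption: a small initial sample of N=50 design points in R^2,
# standing in for the "initial statistical sample of function evaluations".
rng = np.random.default_rng(0)
sample = rng.normal(size=(2, 50))  # gaussian_kde expects shape (dim, N)

# Kernel density model of the joint pdf of the initial set.
kde = gaussian_kde(sample)

# New realizations drawn from the learned density, at no additional cost in
# expensive function evaluations (the role the Ito sampler plays in the paper;
# plain KDE resampling is used here as a simple stand-in).
new_points = kde.resample(200, seed=1)

print(new_points.shape)  # (2, 200)
print(kde(np.zeros((2, 1)))[0] > 0.0)
```

The key point the sketch makes is that, once the density is learned, sample size is decoupled from the fixed budget of expensive evaluations.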

Cited by 27 publications (54 citation statements)
References 37 publications
“…In this paper, the probabilistic learning used for solving the optimization problem under uncertainties refers to an algorithm that is based on the two following methodologies. (i) The first one is the novel methodology proposed in [25,88] for solving the CNOPPU, which consists in using:
- an optimizer that is adapted to the nature of the optimization problem under consideration,
- a dataset made up of a small number of points generated by the optimizer, for which only a small number of expensive function evaluations is carried out.…”
Section: Algorithms For Solving Optimization Problems Under Uncertain…
Citation type: mentioning; confidence: 99%
“…(ii) The second ingredient is a smoothing technique [25,88] for estimating J(w_0) and c(w_0) at any point w_0 in C_w, by using only the additional realizations generated from the given dataset. The points w_0 are those generated by the optimization algorithm.…”
Section: Probabilistic Learning Algorithm Proposed To Circumvent The…
Citation type: mentioning; confidence: 99%
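The smoothing step this quote refers to — estimating J(w_0) and c(w_0) at an arbitrary point w_0 from realizations alone — can be illustrated with a Nadaraya–Watson kernel-weighted average. The Gaussian kernel, the bandwidth, and the toy objective below are illustrative assumptions, not the technique of [25,88] itself:

```python
import numpy as np

def kernel_smooth(w0, W, J_vals, h=0.3):
    """Nadaraya-Watson estimate of J(w0) from realizations (W[i], J_vals[i]).

    A generic smoothing stand-in for the cited technique; the Gaussian
    kernel and the bandwidth h are illustrative assumptions.
    """
    d2 = np.sum((W - w0) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / h**2)
    return float(np.sum(weights * J_vals) / np.sum(weights))

# Toy realizations: sampled points W with noisy evaluations of J(w) = |w|^2.
rng = np.random.default_rng(0)
W = rng.uniform(-2.0, 2.0, size=(2000, 2))
J_vals = np.sum(W**2, axis=1) + rng.normal(scale=0.1, size=2000)

# Estimate the objective at a point w0 that was never evaluated directly;
# the result sits near |w0|^2 = 1.0, up to smoothing bias and noise.
w0 = np.array([1.0, 0.0])
print(kernel_smooth(w0, W, J_vals))
```

The same weighted average applied to constraint samples gives the c(w_0) estimate; no new evaluations of the expensive functions are needed at w_0.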
“…The development of mathematical and algorithmic constructs that promote learning with successive optimization steps continues to be a key challenge in that regard. In the framework of this paper, "optimization under uncertainties" is related to the novel methodology that is proposed in [3]
- for solving a probabilistic nonconvex constrained optimization (an OUU),…”
Section: Meaning Of "Optimization Under Uncertainties" In The Framewo…
Citation type: mentioning; confidence: 99%
“…The second ingredient is a smoothing technique that allows for estimating the mathematical expectations in the computation of f(w_0) and c(w_0) at any point w_0 in C_w, by using only the given dataset and the additional samples [3].…”
Section: Ingredients
Citation type: mentioning; confidence: 99%
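Putting the two cited ingredients together — a fixed, small budget of expensive evaluations plus a smoothed estimate of the objective fed to an optimizer — can be sketched as follows. The budget size, the surrogate, and the use of SciPy's Nelder–Mead optimizer are assumptions made for the sketch, not the authors' algorithm:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def expensive_objective(w):
    # Stand-in for a costly simulation; in the cited setting each call is expensive.
    return float(np.sum(w**2))

# Ingredient 1: a fixed, small budget of expensive evaluations (here 256).
W = rng.uniform(-2.0, 2.0, size=(256, 2))
J = np.array([expensive_objective(w) for w in W])

# Ingredient 2: a kernel-smoothed surrogate for J built only from those realizations.
def smoothed_J(w0, h=0.6):
    wts = np.exp(-0.5 * np.sum((W - w0) ** 2, axis=1) / h**2)
    return float(np.sum(wts * J) / np.sum(wts))

# The optimizer searches the surrogate only: no further calls to
# expensive_objective are made, so the evaluation budget stays fixed.
res = minimize(smoothed_J, x0=np.array([1.5, -1.5]), method="Nelder-Mead")
print(res.x)  # near the true minimizer at the origin
```

The design point is that the search cost is paid on the cheap surrogate, keeping the number of expensive function evaluations fixed at the initial budget, which is the premise of the paper's title.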