2019
DOI: 10.1007/s10898-019-00839-1

On the choice of the low-dimensional domain for global optimization via random embeddings

Abstract: The challenge of taking many variables into account in optimization problems may be overcome under the hypothesis of low effective dimensionality. Then, the search for solutions can be reduced to the random embedding of a low-dimensional space into the original one, resulting in a more manageable optimization problem. Specifically, in the case of time-consuming black-box functions and when the budget of evaluations is severely limited, global optimization with random embeddings appears as a sound alternative to…
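The random-embedding idea the abstract describes can be illustrated with a short sketch: draw a random matrix A in R^{D×d}, then optimize the composition y ↦ f(p(Ay)) over a bounded low-dimensional domain Y, where p projects back onto the original box. The toy objective f, the dimensions, and the use of differential evolution below are illustrative assumptions, not the paper's algorithm; choosing the domain Y well is precisely the question the paper addresses.

```python
# Minimal sketch of global optimization via a random embedding,
# assuming a black-box f on [-1, 1]^D with low effective dimension.
import numpy as np
from scipy.optimize import differential_evolution

D, d = 25, 2                      # ambient and embedding dimensions (hypothetical)
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))   # random embedding matrix

def f(x):
    # Toy objective: only the first two coordinates matter,
    # so the effective dimension is 2.
    return (x[0] - 0.3) ** 2 + (x[1] + 0.5) ** 2

def g(y):
    # Embed the low-dimensional point into R^D, then project
    # back onto the box [-1, 1]^D before evaluating f.
    x = np.clip(A @ y, -1.0, 1.0)
    return f(x)

# A simple choice of low-dimensional domain Y = [-sqrt(d), sqrt(d)]^d;
# the paper studies how this choice affects the search.
bounds = [(-np.sqrt(d), np.sqrt(d))] * d
result = differential_evolution(g, bounds, seed=0)
print("best value:", result.fun, "at y =", result.x)
```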


Cited by 42 publications (38 citation statements). References 38 publications.
“…In this catenoid example, the additive GP and the GP in the space of (all) 7 principal components achieve comparable results, both in terms of best value, and of function evaluations to attain the targets. Indeed, the true dimension (7) is relatively low, and we have noticed that the 5, 6 or even 7 first eigenshapes often got classified as active for the additive GP.…”
Section: Methods
Confidence: 98%
“…These results indicate a better performance of AddGP(α_a + α_ā), which benefits from the prioritization of the most influential eigenshapes in the additive model and, at the same time, accounts for all 7 eigenshapes. Modeling in the space of the full α's (GP(α_1:7)) performs fairly well too because of the low true dimensionality (7). Despite its lower dimensionality, GP(α_1:4) does not work well.…”
Section: Catenoid Shape (Example 5)
Confidence: 99%
“…The embedding dimension is specified a priori, i.e., no mechanism is provided to learn the embedding dimension. Recent work has investigated approaches to automate the selection of the embedding dimension [41].…”
Section: Projection-Based Methods
Confidence: 99%
“…Wang et al. 12 project the input producing the maximum of the acquisition function to the closest point on the boundary of the search space if it lies outside it. Binois et al. 26 and Binois et al. 27 use projections to a non-closest point to reduce over-exploration of the boundaries of the search space.…”
Section: Related Work
Confidence: 99%
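The distinction drawn in this citation can be sketched in code: the closest-point projection onto a box is coordinate-wise clipping, which piles embedded points onto the faces of the box, while a non-closest projection redistributes them. The radial rescaling below is only an illustrative alternative under that assumption, not the specific mapping used by Binois et al.

```python
# Sketch contrasting two projections of an embedded point onto [-1, 1]^D.
import numpy as np

def clip_project(x):
    # Closest-point (orthogonal) projection: out-of-box coordinates
    # land exactly on a face, concentrating points on the boundary.
    return np.clip(x, -1.0, 1.0)

def radial_project(x):
    # Illustrative non-closest alternative (an assumption, not the
    # mapping of Binois et al.): shrink the whole vector until it
    # fits inside the box, preserving its direction.
    m = np.max(np.abs(x))
    return x / m if m > 1.0 else x

x = np.array([2.0, 0.5, -3.0])
print(clip_project(x))    # [ 1.   0.5 -1. ]
print(radial_project(x))  # [ 0.66666667  0.16666667 -1. ]
```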