2021
DOI: 10.1007/s10589-020-00249-0

Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates

Cited by 26 publications (19 citation statements)
References 32 publications

“…In particular, we consider phase retrieval, which seeks to minimize the function in (18), and blind deconvolution, which seeks to minimize (19). Both of these applications are ones in which Common Random Numbers (CRNs) are a reasonable assumption, making two-point gradient estimates relevant. In particular, in (18) the pairs (a_i, b_i) can be held constant between two function evaluations, and in (19) the triplets (u_i, v_i, b_i) can be fixed as well.…”
Section: Numerical Results (mentioning)
confidence: 99%
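
The excerpt above hinges on two-point gradient estimates under Common Random Numbers: the same random sample is reused for both function evaluations so that the sampling noise cancels in the difference. The following is a minimal sketch of that idea, not code from the cited paper; the least-squares phase retrieval objective is an assumed form (the excerpt's equation (18) was not captured), and all function and variable names are illustrative.

```python
import numpy as np

def phase_retrieval_loss(x, A, b):
    # Assumed least-squares phase retrieval objective,
    # (1/m) * sum_i ((a_i @ x)**2 - b_i)**2, standing in for the
    # excerpt's equation (18).
    return np.mean(((A @ x) ** 2 - b) ** 2)

def two_point_crn_gradient(loss, x, A, b, h=1e-4, rng=None):
    # Two-point estimate of a directional derivative along a random
    # unit direction u. The same sample (A, b) is reused for both
    # evaluations: the CRN assumption the excerpt describes.
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)
    diff = loss(x + h * u, A, b) - loss(x - h * u, A, b)
    return (diff / (2.0 * h)) * u

# Usage on synthetic data: one gradient estimate at a random point.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
x_true = rng.standard_normal(10)
b = (A @ x_true) ** 2
g = two_point_crn_gradient(phase_retrieval_loss, rng.standard_normal(10), A, b)
```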
“…These are nice examples of how well the zeroth-order algorithm works when the stepsize is properly chosen.…”
Section: Comparison With Methods Using a Stochastic Subgradient Oraclementioning (mentioning)
confidence: 99%
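
The stepsize sensitivity this excerpt points to can be made concrete with a plain zeroth-order descent loop. This is a generic sketch under my own assumptions, not the method of the cited paper; `step` and the smoothing radius `h` are illustrative parameters.

```python
import numpy as np

def zeroth_order_descent(loss, x0, step, n_iters=500, h=1e-4, rng=None):
    # Generic two-point zeroth-order descent: estimate a directional
    # gradient from two function values, then move a fixed distance
    # `step` against it. Too large a step diverges, too small a step
    # stalls, which is the sensitivity the excerpt refers to.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)
        g = (loss(x + h * u) - loss(x - h * u)) / (2.0 * h) * u
        x -= step * g
    return x
```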
“…This would require improving its performance on unimodal functions with high conditioning, and improving its performance at very early steps, for example by leveraging SMAC to create the initial DoEs. Another important direction for future work is the extension of TREGO to the case of noisy observations, following recent results in DFO [47,48] and established BO techniques [49].…”
Section: Discussion (mentioning)
confidence: 99%
“…These uncertainties arise from the random initialization of the weight matrices and bias vectors, from dropout, and from the descent step sizes used during backpropagation. We are the first to use a stochastic derivative-free optimization algorithm, the stochastic mesh adaptive direct search (StoMADS) [21], to solve such a problem.…”
Section: Predictive Modeling (mentioning)
confidence: 99%
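
As the cited paper's title indicates, StoMADS is mesh adaptive direct search driven by probabilistic estimates of a noisy blackbox. The sketch below is only schematic: it averages repeated noisy evaluations into a function estimate and accepts a poll point under a sufficient-decrease test tied to the mesh size, but it omits the frame construction, the estimate-accuracy conditions, and the parameter-update rules of the actual algorithm. All names and constants are illustrative, not the authors' implementation.

```python
import numpy as np

def estimate(f_noisy, x, n_samples):
    # Average repeated noisy evaluations into a function estimate.
    return np.mean([f_noisy(x) for _ in range(n_samples)])

def stomads_sketch(f_noisy, x0, delta=1.0, gamma=1.0, n_samples=32, n_iters=50):
    # Schematic StoMADS-like loop: poll the +/- coordinate directions
    # at mesh size `delta`; accept a poll point only if its averaged
    # estimate beats the incumbent by gamma * delta**2 (a sufficient-
    # decrease test); expand the mesh on success, shrink it otherwise.
    x = np.asarray(x0, dtype=float)
    fx = estimate(f_noisy, x, n_samples)
    for _ in range(n_iters):
        improved = False
        for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):
            y = x + delta * d
            fy = estimate(f_noisy, y, n_samples)
            if fy < fx - gamma * delta ** 2:
                x, fx, improved = y, fy, True
                break
        delta = 2.0 * delta if improved else delta / 2.0
    return x
```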