2010
DOI: 10.1016/j.ejor.2010.07.005
Convergence guarantees for generalized adaptive stochastic search methods for continuous global optimization

Cited by 14 publications (6 citation statements)
References 13 publications
“…Currently, a novel version of the q -G method, which is able to guarantee the convergence of the algorithm to the global minimum in a probabilistic sense, is under development. This version is based on the generalized adaptive random search (GARS) framework for deterministic functions (Regis 2010 ). In addition, gains in the performance of the q -G method are expected with the implementation of several improvements, such as inclusion of side, linear and nonlinear restrictions, development of better step selection strategies and others.…”
Section: Discussion
confidence: 99%
“…Inspired by [21], the next theorem deals with the case where f has a unique global minimiser x * over D. In this situation, the sequence of best solutions {X k } k∈N converges to x * almost surely. Theorem 6.2.…”
Section: The Convergence of the P-LBFGSB Algorithm
confidence: 99%
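The almost-sure convergence claimed in the quote above can be stated explicitly. A sketch of the definition, using the quote's own notation ({X_k} the sequence of best solutions, x* the unique global minimiser of f over D):

```latex
% Almost-sure convergence of the best-solution sequence {X_k}
% to the unique global minimiser x^* of f over D:
\Pr\!\left( \lim_{k \to \infty} X_k = x^* \right) = 1,
% equivalently, for every \varepsilon > 0:
\Pr\!\left( \|X_k - x^*\| > \varepsilon \ \text{for infinitely many } k \right) = 0.
```

Uniqueness of x* is what lets the statement be phrased as convergence of the points X_k themselves, rather than only of the function values f(X_k).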
“…2.1]. Due to stochastic nature of the algorithm, the iterates are treated as random vectors whose realizations are in D, following the ideas of [35]. The algorithm is general since the iterates are given randomly by any probability distribution.…”
Section: General Filter-Based Stochastic Algorithm
confidence: 99%
“…Now we analyse the convergence of Algorithm 1 in the probabilistic sense following the ideas of [35]. We say that an algorithm converges to the global minimum of f on D in probability, or almost surely, if the sequence (f (X * k )) converges to f * in probability, or almost surely.…”
Section: 1]
confidence: 99%
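The two convergence modes named in this quote can be written out side by side. A sketch in the quote's notation (f* the global minimum of f on D, (f(X*_k)) the sequence of best objective values found so far):

```latex
% Convergence in probability: for every \varepsilon > 0,
\lim_{k \to \infty} \Pr\!\bigl( f(X^*_k) - f^* > \varepsilon \bigr) = 0.
% Almost-sure convergence (the stronger property):
\Pr\!\Bigl( \lim_{k \to \infty} f(X^*_k) = f^* \Bigr) = 1.
```

Almost-sure convergence implies convergence in probability, so establishing the former for a stochastic search algorithm automatically yields the latter.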