2004
DOI: 10.1137/s105262340240063x

On Accelerated Random Search

Abstract: A new variant of pure random search (PRS) for function optimization is introduced. The basic finite-descent accelerated random search (ARS) algorithm is simple: the search is confined to shrinking neighborhoods of a previous record-generating value, with the search neighborhood reinitialized to the entire space when a new record is found. Local maxima are avoided by an automatic restart feature which reinitializes the search neighborhood after some number of shrink steps have been performed. One goal …
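The abstract's description of finite-descent ARS can be turned into a short sketch. This is a minimal illustration based only on the description above; the function name, the contraction factor, and the restart threshold are illustrative choices, not the paper's.

```python
import random

def ars_maximize(f, dim, iters=10000, shrink=0.5, r_min=1e-6, seed=0):
    """Sketch of finite-descent Accelerated Random Search (ARS).

    Maximizes f over the unit cube [0, 1]^dim. The neighborhood around
    the current record shrinks after each failed trial and is reset to
    the whole space when a new record is found, or when the radius falls
    below r_min (the automatic restart that avoids local maxima).
    """
    rng = random.Random(seed)
    best_x = [rng.random() for _ in range(dim)]
    best_f = f(best_x)
    r = 1.0  # neighborhood radius, relative to the unit cube
    for _ in range(iters):
        # Sample uniformly in the box of radius r around the record,
        # clipped back into the unit cube.
        x = [min(1.0, max(0.0, b + rng.uniform(-r, r))) for b in best_x]
        fx = f(x)
        if fx > best_f:
            best_x, best_f = x, fx
            r = 1.0          # new record: reinitialize to the entire space
        else:
            r *= shrink      # no record: shrink the search neighborhood
            if r < r_min:    # automatic restart after enough shrink steps
                r = 1.0
    return best_x, best_f
```

For example, maximizing `f(x) = -(x - 0.7)**2` on [0, 1] quickly concentrates the record near 0.7, since every shrink cycle refines the current best before the radius resets.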

Cited by 72 publications (65 citation statements)
References 30 publications
“…This method was analysed in [3], where it was shown that this algorithm outperforms the simple PRS. As follows below, the finite-descent version of ARS (the case ρ > 0) converges lazily towards A.…”
Section: Accelerated Random Search
confidence: 99%
“…This paper shows when an optimization method is lazy; the general results presented cover, in particular, the class of simulated annealing algorithms and monotone random search. As an application example from the class of methods based on parameter self-adaptation, it is shown that finite-descent Accelerated Random Search [3] converges lazily. As discussed further on, the undesired lazy-convergence property appears to be a property of the optimization method rather than of the problem function f.…”
Section: Introduction
confidence: 99%
“…Marwala [34-36] successfully applied three separate genetic algorithms to minimize the distance between the measured data and the finite-element predicted data. Touat et al. [37] proposed an accelerated random search algorithm for model updating. The Accelerated Random Search (ARS) algorithm [38] was developed to accelerate the stochastic search process for mathematical optimisation problems.…”
Section: Introduction
confidence: 99%
“…Accelerated Random Search (ARS) [3], and its offshoot OSCARS [14], are methods for bound-constrained global optimization which use the objective function only to determine which of a pair of points is the better. Hence they can easily be modified to address other problems such as (1) when an alternative way of doing pairwise comparisons (e.g.…”
Section: Introduction
confidence: 99%