A new variant of pure random search (PRS) for function optimization is introduced. The basic finite-descent accelerated random search (ARS) algorithm is simple: the search is confined to shrinking neighborhoods of a previous record-generating value, with the search neighborhood reinitialized to the entire space whenever a new record is found. Local maxima are avoided by an automatic restart feature that reinitializes the search neighborhood after some number of shrink steps have been performed.

One goal of this article is to provide rigorous mathematical comparisons of ARS with PRS. It is shown that the sequence produced by the ARS process converges, with probability one, to the maximum of a continuous objective function faster than that of the PRS process by adjustably large multiples of the time step (Theorem 1). For an infinite-descent (no automatic restart) version of ARS, it is shown that if the objective function satisfies a local nonflatness condition, then the right tails of the distributions of inter-record times are exponentially smaller than those of PRS (Theorem 3).

Performance comparisons between ARS, PRS, and three quasi-Newton-type optimization routines are reported in attempting to find extrema of (i) each of a small collection of standard test functions of two variables, and (ii) d-dimensional polynomials with random roots. Also reported is a three-way performance comparison between ARS, PRS, and a simulated annealing algorithm in attempting to solve traveling salesman problems.
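To make the verbal description concrete, the following is a minimal Python sketch of the basic finite-descent ARS loop as described above. The function name `ars_maximize`, the shrink factor, the unit-cube search space, and the use of a radius threshold in place of a fixed count of shrink steps are illustrative assumptions, not the paper's exact parameterization (with geometric shrinking, a radius threshold and a shrink-step count are interchangeable).

```python
import numpy as np

def ars_maximize(f, dim, n_iter=10_000, shrink=0.5, restart_radius=1e-6, seed=None):
    """Sketch of the basic accelerated random search (ARS) idea:
    sample uniformly in a neighborhood of the current record point,
    shrink the neighborhood after each failure, and reset it to the
    whole space on a new record or when the radius has been shrunk
    past a restart threshold (the automatic restart).
    Parameter names and the [0, 1]^d search space are assumptions."""
    rng = np.random.default_rng(seed)
    best_x = rng.random(dim)          # search space assumed to be the unit cube
    best_val = f(best_x)
    radius = 1.0                      # neighborhood initially covers the whole space
    for _ in range(n_iter):
        # Uniform candidate in the box neighborhood, clipped to the search space.
        low = np.clip(best_x - radius, 0.0, 1.0)
        high = np.clip(best_x + radius, 0.0, 1.0)
        x = rng.uniform(low, high)
        val = f(x)
        if val > best_val:            # new record: accept and reinitialize neighborhood
            best_x, best_val = x, val
            radius = 1.0
        else:
            radius *= shrink          # failed step: shrink the neighborhood
            if radius < restart_radius:
                radius = 1.0          # automatic restart to avoid local maxima
    return best_x, best_val

# Example usage: maximize a smooth test function on [0, 1]^2.
if __name__ == "__main__":
    g = lambda x: -np.sum((x - 0.3) ** 2)
    print(ars_maximize(g, dim=2, seed=0))
```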