Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation 2009
DOI: 10.1145/1569901.1569976

Efficient natural evolution strategies

Abstract: Efficient Natural Evolution Strategies (eNES) is a novel alternative to conventional evolutionary algorithms, using the natural gradient to adapt the mutation distribution. Unlike previous methods based on natural gradients, eNES uses a fast algorithm to calculate the inverse of the exact Fisher information matrix, thus increasing both the robustness and the performance of its evolution gradient estimation, even in higher dimensions. Additional novel aspects of eNES include optimal fitness baselines and importance mixing.
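The natural-gradient update described in the abstract can be sketched in a few lines. Below is a minimal, illustrative NES-style loop for a separable Gaussian search distribution; the paper's eNES additionally adapts a full covariance matrix via the exact inverse Fisher matrix and derives an optimal fitness baseline, so treat all function and parameter names here as our own, not the reference implementation.

```python
import numpy as np

def nes_sketch(f, dim, popsize=20, lr=0.1, iters=200, seed=0):
    """Illustrative natural-gradient ES (maximizes f); not the authors' eNES.

    For a Gaussian, premultiplying the log-likelihood gradient by the
    inverse Fisher matrix yields the closed-form update terms used below.
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)            # mean of the search distribution
    log_sigma = np.zeros(dim)     # per-coordinate log standard deviations
    for _ in range(iters):
        sigma = np.exp(log_sigma)
        s = rng.standard_normal((popsize, dim))   # standardized samples
        z = mu + sigma * s                        # candidate solutions
        fit = np.array([f(zi) for zi in z])
        u = fit - fit.mean()   # simple baseline to cut gradient variance
                               # (eNES derives an *optimal* baseline instead)
        mu += lr * sigma * (u @ s) / popsize                   # natural gradient, mean
        log_sigma += lr * (u @ (s**2 - 1.0)) / (2 * popsize)   # natural gradient, scale
    return mu

# Usage: minimize a shifted sphere function by maximizing its negation.
best = nes_sketch(lambda x: -np.sum((x - 3.0) ** 2), dim=5)
```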

Cited by 83 publications (104 citation statements; citing years 2010–2024)
References 7 publications
“…A more general approach for partially observable environments directly evolves programs for RNNs with internal states (no need for the Markovian assumption) by applying evolutionary algorithms [30,50,83] to RNN weight matrices [26,85,89,105]. Recent work brought progress by co-evolving the comparatively small weight vectors of individual neurons and synapses [21], by natural-gradient-based stochastic search strategies [19,53,92,93,102,103], and by reducing search spaces through weight-matrix compression [36,61]. Our RL RNNs now outperform many previous methods on benchmarks [21], creating memories of important events and solving numerous tasks unsolvable by classical RL methods.…”
Section: Recurrent / Deep Neural Network (mentioning; confidence: 99%)
“…Our RL RNNs now outperform many previous methods on benchmarks [21], creating memories of important events and solving numerous tasks unsolvable by classical RL methods. Several best paper awards resulted from this research, e.g., [18,92].…”
Section: Recurrent / Deep Neural Network (mentioning; confidence: 99%)
“…Natural evolution strategies (NES) [3, 8–11] are a class of evolutionary algorithms for real-valued optimization. They maintain a Gaussian search distribution with a fully adaptive covariance matrix.…”
Section: Natural Evolution Strategies (mentioning; confidence: 99%)
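As a concrete picture of the "fully adaptive covariance matrix" mentioned in the excerpt above, one common parameterization (a sketch of the general idea, not necessarily the cited papers' exact scheme) stores a matrix factor A and samples through it, so the covariance A Aᵀ remains positive definite no matter how A is updated:

```python
import numpy as np

# Sketch: sampling from N(mu, Sigma) with Sigma = A @ A.T, so free updates
# to the unconstrained factor A can never yield an invalid covariance.
rng = np.random.default_rng(1)
dim, popsize = 5, 10
mu = np.zeros(dim)
A = np.eye(dim)                        # factor of the covariance matrix
s = rng.standard_normal((popsize, dim))
z = mu + s @ A.T                       # rows are draws from N(mu, A @ A.T)
```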
“…The recently introduced family of natural evolution strategies (NES) [3, 8–11] is an optimization method that follows a sampled natural gradient of the expected fitness and, as such, provides a more principled alternative to CMA-ES. In this paper we combine the well-founded framework of NES with the proven approach of tackling MOO using evolution strategies.…”
Section: Introduction (mentioning; confidence: 99%)
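For reference, the "sampled natural gradient of the expected fitness" quoted above takes the standard NES form (notation ours): with search distribution pi(z | theta), expected fitness J(theta) = E[f(z)], Fisher matrix F, population size lambda, and learning rate eta,

```latex
\nabla_\theta J(\theta)
  = \mathbb{E}_{z \sim \pi(\cdot \mid \theta)}\big[ f(z)\, \nabla_\theta \log \pi(z \mid \theta) \big]
  \approx \frac{1}{\lambda} \sum_{k=1}^{\lambda} f(z_k)\, \nabla_\theta \log \pi(z_k \mid \theta),
\qquad
\theta \leftarrow \theta + \eta\, F^{-1} \nabla_\theta J(\theta).
```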
“…Modern, flexible, and easy-to-implement natural evolution strategies [31,28,27,11,10,25] are used as search algorithms for kernel representations of continuous functions of a single variable (time). Their efficiency is demonstrated on the problem of finding kinematic plan representations for the movements of redundant robot arms.…”
Section: Introduction (mentioning; confidence: 99%)