2007
DOI: 10.1007/s00521-007-0087-9
Using evolution to improve neural network learning: pitfalls and solutions

Abstract: Autonomous neural network systems typically require fast learning and good generalization performance, and there is potentially a trade-off between the two. The use of evolutionary techniques to improve the learning abilities of neural network systems is now widespread. However, there is a range of different evolutionary approaches that could be applied, and no systematic investigation has been carried out to find which work best. In this paper, such an investigation is presented, and it is shown that a range …

Cited by 19 publications (47 citation statements)
References 28 publications
“…One solution is to use an evolutionary algorithm to optimize the key BP learning parameters, such as the random initial weight range [-ρ, ρ] and the learning rate η. With a fixed, sufficiently large number of training epochs for each problem, the evolved learning rate is then able to implement a form of early stopping and avoid over-fitting, which consistently leads to improved performance [3]. However, such evolutionary approaches tend to be rather computationally intensive, and might be regarded as giving BP an unfair advantage over the ABC.…”
Section: Neural Network Training Using Optimized BP
confidence: 99%
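The approach quoted above — evolving the initial weight range [-ρ, ρ] and learning rate η for backpropagation — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's actual setup: the XOR task, network size, population size, and log-normal mutation scheme are all stand-ins chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR task stands in for the benchmark problems (an assumption;
# the cited work uses its own problem suite).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_fitness(rho, eta, hidden=4, epochs=1000, seed=1):
    """Train a one-hidden-layer net with plain BP and return final MSE.
    Initial weights are drawn uniformly from [-rho, rho]."""
    r = np.random.default_rng(seed)
    W1 = r.uniform(-rho, rho, (2, hidden))
    W2 = r.uniform(-rho, rho, (hidden, 1))
    for _ in range(epochs):
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        d_out = (out - y) * out * (1 - out)   # gradient through sigmoid output
        d_hid = (d_out @ W2.T) * h * (1 - h)  # backprop to hidden layer
        W2 -= eta * (h.T @ d_out)
        W1 -= eta * (X.T @ d_hid)
    return float(np.mean((out - y) ** 2))

def evolve(pop_size=10, generations=15):
    """Evolve (rho, eta) pairs: truncation selection + log-normal mutation."""
    pop = [(rng.uniform(0.01, 2.0), rng.uniform(0.01, 2.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: bp_fitness(*p))
        parents = pop[: pop_size // 2]
        children = [tuple(max(1e-3, v * np.exp(rng.normal(0.0, 0.2))) for v in p)
                    for p in parents]
        pop = parents + children
    return min(pop, key=lambda p: bp_fitness(*p))

best_rho, best_eta = evolve()
```

Note the cost the excerpt warns about: each fitness evaluation is a full BP training run, so the evolutionary loop multiplies training cost by the population size times the number of generations.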
“…However, such evolutionary approaches tend to be rather computationally intensive, and might be regarded as giving BP an unfair advantage over the ABC. A standard non-evolutionary approach will therefore be studied first, but using information that consistently emerges from evolutionary investigations [3], namely that very small initial weight ranges and very slow learning rates tend to work best, with a standard early stopping approach to set the number of epochs. The details of the experimental set-up and analysis were chosen to provide the closest possible match with the ABC approach discussed in the next section.…”
Section: Neural Network Training Using Optimized BP
confidence: 99%
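The non-evolutionary recipe described here — small initial weights, a slow learning rate, and early stopping on a held-out validation set — can be sketched as follows. The noisy sine regression task, network size, and patience value are illustrative assumptions, not the cited experiment.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_early_stop(rho=0.05, eta=0.05, hidden=10,
                     patience=100, max_epochs=20000, seed=0):
    """Plain BP with a small initial weight range [-rho, rho], a slow
    learning rate, and early stopping on a held-out validation set."""
    rng = np.random.default_rng(seed)
    # Noisy sine regression stands in for the benchmark task (an assumption).
    x = rng.uniform(-1, 1, (120, 1))
    t = np.sin(np.pi * x) + rng.normal(0.0, 0.1, x.shape)
    x_tr, t_tr, x_va, t_va = x[:80], t[:80], x[80:], t[80:]
    W1 = rng.uniform(-rho, rho, (1, hidden))
    W2 = rng.uniform(-rho, rho, (hidden, 1))
    best_val, best_epoch = np.inf, 0
    for epoch in range(max_epochs):
        h = sigmoid(x_tr @ W1)
        out = h @ W2                      # linear output for regression
        d_out = (out - t_tr) / len(x_tr)  # MSE gradient
        d_hid = (d_out @ W2.T) * h * (1 - h)
        W2 -= eta * (h.T @ d_out)
        W1 -= eta * (x_tr.T @ d_hid)
        val = float(np.mean((sigmoid(x_va @ W1) @ W2 - t_va) ** 2))
        if val < best_val:
            best_val, best_epoch = val, epoch
        elif epoch - best_epoch >= patience:
            break  # stop: validation error has stopped improving
    return best_val, best_epoch

val_mse, stop_epoch = train_early_stop()
```

This keeps the single training run cheap: instead of evolving the stopping point, the validation set decides when to stop, which is the "standard early stopping approach" the excerpt refers to.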