2007
DOI: 10.1016/j.ins.2007.02.021
Comparing evolutionary hybrid systems for design and optimization of multilayer perceptron structure along training parameters

Cited by 27 publications (20 citation statements)
References 85 publications
“…The advantages of the 2LP are its simplicity and high classification speed [12], [15], [16]. Quick resetting and resource saving at a low CEP are believed to be achievable if the 2LP is optimised by its hidden-layer neuron number (HLNN) along with the training parameters [12], [17], [18]. The optimisation criterion is the minimisation of the CEP, which depends on the training parameters.…”
Section: Related Work
confidence: 99%
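This optimisation criterion lends itself to a direct illustration. Below is a minimal sketch, assuming scikit-learn's MLPClassifier as the 2LP and a plain grid search over the HLNN and the learning rate in place of the evolutionary hybrids compared in the paper; the synthetic dataset and parameter grids are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch: minimise the classification error percentage (CEP) of a
# two-layer perceptron (2LP) over its hidden-layer neuron number (HLNN)
# and one training parameter. Grid search stands in for the evolutionary
# hybrids discussed in the paper; dataset and grids are illustrative.
from itertools import product

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

best = (100.0, None)
for hlnn, lr in product([4, 8, 16, 32], [0.001, 0.01, 0.1]):
    mlp = MLPClassifier(hidden_layer_sizes=(hlnn,), learning_rate_init=lr,
                        max_iter=500, random_state=0).fit(X_tr, y_tr)
    cep = 100.0 * (1.0 - mlp.score(X_te, y_te))  # error percentage
    if cep < best[0]:
        best = (cep, (hlnn, lr))

print(f"best CEP = {best[0]:.2f}% at (HLNN, lr) = {best[1]}")
```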
“…This integer might be determined by R and B, but the relationships between these three integers are implicit [17], [20], [21].…”
Section: Training Parameters
confidence: 99%
“…In particular, however, the speed is more than adequate for the applications we have used it for (Araujo et al. 2008; Castillo et al. 2002, 2008; Merelo-Guervós and Castillo-Valdivieso 2004; Castillo et al. 2007, for instance), and the availability of auxiliary libraries in Perl, together with the speed with which development and subsequent experiment processing can be done, more than compensates for its lack of raw speed. Running a profiling tool over the programs also helps to identify the bottlenecks, which can then be attacked by using different techniques or implementations in other languages.…”
Section: Fitness Classes
confidence: 99%
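The profiling workflow described above is language-agnostic. The sketch below shows it in Python using the standard-library cProfile (the excerpt concerns Perl programs, where a profiler such as Devel::NYTProf plays the same role); the toy fitness function and population sizes are illustrative assumptions.

```python
# Minimal sketch of the profiling workflow: time a toy evolutionary run,
# then rank functions by cumulative time to locate the bottleneck.
# cProfile/pstats are Python standard library; the toy fitness is illustrative.
import cProfile
import pstats
import random

def fitness(genome):
    # Deliberately naive inner loop: the kind of hotspot profiling exposes.
    return sum(g * g for g in genome)

def run():
    pop = [[random.random() for _ in range(200)] for _ in range(500)]
    for _ in range(50):
        pop.sort(key=fitness)  # fitness evaluation dominates the runtime

profiler = cProfile.Profile()
profiler.runcall(run)
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```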
“…Castillo et al. [12] explored several methods that combine evolutionary algorithms and local search to optimize multilayer perceptrons. The authors explored a method that optimizes the architecture and initial weights of multilayer perceptrons, a search algorithm for the training-algorithm parameters, and finally a co-evolutionary algorithm that handles the architecture, the network's initial weights, and the training-algorithm parameters.…”
Section: Recent Applications Of Evolutionary Neural Network In Practice
confidence: 99%
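As a rough illustration of the last approach, the sketch below evolves a joint genome of one architecture parameter (hidden-layer size) and one training parameter (learning rate). A plain single-population loop with truncation selection stands in for the paper's co-evolutionary scheme, the initial-weight component is omitted for brevity, and the genome encoding, mutation operators, dataset, and scikit-learn fitness are assumptions for illustration.

```python
# Minimal sketch: evolve (hidden-layer size, learning rate) jointly, with
# fitness = validation accuracy of a trained MLP. A single-population loop
# stands in for the co-evolutionary scheme described in the excerpt.
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

random.seed(1)
X, y = make_classification(n_samples=400, n_features=15, random_state=1)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=1)

def fitness(genome):
    """Validation accuracy of an MLP built from the genome."""
    hidden, lr = genome
    mlp = MLPClassifier(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                        max_iter=300, random_state=1).fit(X_tr, y_tr)
    return mlp.score(X_va, y_va)

def mutate(genome):
    """Perturb hidden-layer size and learning rate within sane bounds."""
    hidden, lr = genome
    hidden = max(2, hidden + random.choice([-2, 0, 2]))
    lr = min(0.5, max(1e-4, lr * random.choice([0.5, 1.0, 2.0])))
    return (hidden, lr)

pop = [(random.randint(4, 32), 10 ** random.uniform(-4, -1)) for _ in range(8)]
for _ in range(5):
    pop.sort(key=fitness, reverse=True)  # rank by validation accuracy
    pop = pop[:4] + [mutate(random.choice(pop[:4])) for _ in range(4)]

print("best (hidden, learning rate):", max(pop, key=fitness))
```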