2003
DOI: 10.1007/3-540-44868-3_46
Cooperative Co-evolution of Multilayer Perceptrons

Cited by 10 publications (17 citation statements)
References 13 publications
“…The complete description of the method and the results obtained using classification problems have been presented elsewhere [6], [7], [8], [9]. The designed method uses an elitist algorithm [29].…”
Section: The Methods (citation type: mentioning)
confidence: 99%
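The "elitist algorithm" mentioned in this citation statement can be illustrated with a minimal sketch. Everything below (the 1-D fitness function, population size, mutation scale) is illustrative and not taken from the cited method; the only point is that the best individuals survive each generation unchanged, so the best fitness can never degrade.

```python
import random

random.seed(1)

def fitness(x):
    # Illustrative 1-D fitness: higher is better, optimum at x = 7.
    return -abs(x - 7)

pop = [random.uniform(0, 20) for _ in range(10)]
best0 = max(fitness(x) for x in pop)

for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:2]                      # elitism: top 2 copied verbatim
    offspring = [p + random.gauss(0, 1.0) for p in pop[:8]]
    pop = elite + offspring

# Because the elite always includes the current best, the best fitness
# is monotonically non-decreasing across generations.
print(max(fitness(x) for x in pop) >= best0)
```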
“…We propose to focus the effort on the ANN optimization using GProp [6], [7], [8], [9], an evolutionary method for the design and optimization of neural networks.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…A given fully connected FNN may become a partially connected network after learning. The GA equipped with the backpropagation (BP) [135] can be used to train both the architecture and the weights of the multilayer perceptron [128,136,137]. In [128], the Quickprop algorithm [138] is used to tune a solution and to reach the nearest local minimum from the solution found by the GA.…”
Section: Simultaneously Evolving Architecture and Parameters (citation type: mentioning)
confidence: 99%
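As a rough illustration of the hybrid scheme this statement describes (an EA evolving both architecture and weights, with a local trainer tuning each solution), the sketch below evolves the hidden-layer size and weight vector of a tiny MLP on XOR. It is not the cited implementation: the finite-difference gradient loop is a crude stand-in for Quickprop/BP, and all parameters are illustrative.

```python
import math
import random

random.seed(0)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def new_genome(h):
    # Genotype: hidden-layer size h (architecture gene) plus a flat
    # weight vector: 3 weights per hidden unit, h + 1 for the output.
    return {"h": h, "w": [random.uniform(-1, 1) for _ in range(4 * h + 1)]}

def forward(g, x):
    h, w = g["h"], g["w"]
    hid = [math.tanh(w[3 * j] * x[0] + w[3 * j + 1] * x[1] + w[3 * j + 2])
           for j in range(h)]
    s = sum(w[3 * h + j] * hid[j] for j in range(h)) + w[-1]
    return 1.0 / (1.0 + math.exp(-s))    # sigmoid output unit

def mse(g):
    return sum((forward(g, x) - y) ** 2 for x, y in XOR) / len(XOR)

def refine(g, steps=15, lr=0.5, eps=1e-4):
    # Local tuning by finite-difference gradient descent
    # (a stand-in for Quickprop / backpropagation).
    for _ in range(steps):
        grad = []
        for i in range(len(g["w"])):
            g["w"][i] += eps
            up = mse(g)
            g["w"][i] -= 2 * eps
            dn = mse(g)
            g["w"][i] += eps
            grad.append((up - dn) / (2 * eps))
        for i, gr in enumerate(grad):
            g["w"][i] -= lr * gr
    return g

pop = [new_genome(random.choice([2, 3, 4])) for _ in range(12)]
f0 = min(mse(g) for g in pop)
for _ in range(15):
    pop.sort(key=mse)
    parents = pop[:6]                    # elitist truncation selection
    children = []
    for p in parents:
        c = {"h": p["h"], "w": [w + random.gauss(0, 0.3) for w in p["w"]]}
        children.append(refine(c))       # tuned weights kept in the genome
    pop = parents + children
best = min(pop, key=mse)
# Parents survive unchanged, so the best error never increases.
print(mse(best) <= f0)
```

Keeping the tuned weights in the child's genome makes this variant Lamarckian; evaluating the tuned network but keeping the untuned genome would be the Baldwinian alternative.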
“…This strategy avoids Lamarckism. The GProp method [136] is modified in [137] by integrating the Quickprop as a training operator, as performed in [128]. This method, thus, implements the Lamarckian evolution.…”
Section: Simultaneously Evolving Architecture and Parameters (citation type: mentioning)
confidence: 99%
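The Lamarckian-versus-Baldwinian distinction drawn in this statement fits in a few lines. The toy "training" below (gradient descent on a 1-D quadratic) and all names are illustrative; the point is only where the result of learning goes: back into the genome (Lamarckian) or only into the fitness value (Baldwinian, which "avoids Lamarckism").

```python
def local_tune(w, steps=10, lr=0.1):
    # Toy 'training': gradient descent on f(w) = (w - 3)^2.
    for _ in range(steps):
        w -= lr * 2.0 * (w - 3.0)
    return w

def evaluate(genome, lamarckian):
    tuned = local_tune(genome)
    fitness = -(tuned - 3.0) ** 2        # higher is better
    # Lamarckian: the tuned value is written back into the genome, so
    # learning results are inherited. Baldwinian: only the fitness
    # reflects learning; the genome itself is returned unchanged.
    return (tuned if lamarckian else genome), fitness

g = 10.0
g_lam, f_lam = evaluate(g, lamarckian=True)
g_bal, f_bal = evaluate(g, lamarckian=False)
print(g_lam < g, g_bal == g, f_lam == f_bal)
```

Both variants see the same fitness for this individual; they differ only in what the next generation inherits.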
“…A peculiar aspect is that BP is not used as some genetic operator, as it is the case in some related work [1]. Instead, the EA optimizes both the topology and the weights of the networks; BP is optionally used to decode a genotype into a phenotype NN.…”
Section: Neuro-genetic Approach (citation type: mentioning)
confidence: 99%
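A minimal sketch of that decoding idea, using a toy one-parameter "network" (nothing below is from the cited system): BP-style tuning runs only inside the genotype-to-phenotype decoding step, the fitness is measured on the phenotype, and the evolved genotype itself is never modified.

```python
import copy

def decode(genotype, bp_steps=25, lr=0.1):
    # Genotype -> phenotype decoding. The gradient loop is a stand-in
    # for the optional backpropagation pass; it acts on a copy, so the
    # evolved genotype is left untouched.
    phenotype = copy.deepcopy(genotype)
    for _ in range(bp_steps):
        # Toy 'training': pull the weight toward a target of 1.0.
        phenotype["w"] -= lr * 2.0 * (phenotype["w"] - 1.0)
    return phenotype

genotype = {"w": 5.0}
phenotype = decode(genotype)
fitness = -(phenotype["w"] - 1.0) ** 2   # fitness measured on the phenotype
print(genotype["w"] == 5.0, phenotype["w"] < 1.1)
```

Since the genotype is unchanged, learning here shapes selection only through the fitness it produces, not through inheritance.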