2012
DOI: 10.5120/8143-1891
Performance Analysis of Neural Networks Training using Real Coded Genetic Algorithm

Abstract: Multilayer perceptrons (MLPs) are widely used for pattern classification and regression problems. The backpropagation (BP) algorithm is a well-known technique for training multilayer perceptrons. However, for optimal training convergence, its learning-rate and momentum parameters must be tuned by trial and error. Furthermore, the backpropagation algorithm sometimes fails to achieve global convergence. To alleviate these problems, we suggest genetic-algorithm-based training for the MLP network. Both binary coded an…
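The abstract contrasts backpropagation with genetic-algorithm-based training of MLP weights. Below is a minimal sketch of the real-coded GA idea on a toy XOR task; the network size, GA operators (tournament selection, blend crossover, Gaussian mutation), and all hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch: training a small MLP's weights with a real-coded genetic algorithm
# instead of backpropagation. All settings below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic test case for MLP training.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
N_WEIGHTS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def decode(chromosome):
    """Unpack a flat real-valued chromosome into the MLP's weight matrices and biases."""
    i = 0
    W1 = chromosome[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = chromosome[i:i + N_HID]; i += N_HID
    W2 = chromosome[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = chromosome[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(chromosome):
    """Fitness: mean squared error of the MLP encoded by the chromosome (lower is better)."""
    W1, b1, W2, b2 = decode(chromosome)
    hidden = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output unit
    return np.mean((out - y) ** 2)

POP, GENS, CX_ALPHA, MUT_STD, MUT_RATE = 60, 300, 0.5, 0.1, 0.1
population = rng.uniform(-1.0, 1.0, size=(POP, N_WEIGHTS))

for gen in range(GENS):
    fitness = np.array([mse(ind) for ind in population])
    new_pop = [population[np.argmin(fitness)].copy()]  # elitism: carry over the best individual
    while len(new_pop) < POP:
        # Binary tournament selection for each parent.
        a, b = rng.integers(POP, size=2)
        p1 = population[a] if fitness[a] < fitness[b] else population[b]
        a, b = rng.integers(POP, size=2)
        p2 = population[a] if fitness[a] < fitness[b] else population[b]
        # Blend (BLX-alpha style) arithmetic crossover on real-valued genes.
        w = rng.uniform(-CX_ALPHA, 1 + CX_ALPHA, size=N_WEIGHTS)
        child = w * p1 + (1 - w) * p2
        # Gaussian mutation applied gene-wise with probability MUT_RATE.
        mask = rng.random(N_WEIGHTS) < MUT_RATE
        child[mask] += rng.normal(0.0, MUT_STD, size=mask.sum())
        new_pop.append(child)
    population = np.array(new_pop)

best = min(population, key=mse)
print(f"final MSE: {mse(best):.4f}")
```

A real-coded representation keeps each weight as a floating-point gene, so crossover and mutation operate directly in weight space rather than on a binary encoding; the binary-coded variant the abstract also mentions would instead discretize each weight into a bit string.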

Cited by 5 publications (2 citation statements)
References 20 publications
“…In the past few years, pattern classification using feed-forward neural networks (FNN), specifically the multilayer perceptron (MLP), has been considered a promising neural network model [2] due to its capability to classify complex real-world problems without prior knowledge of the problem domain. In classification, the network is first trained on a set of paired data from the dataset to evolve its free parameters, that is, to select optimal values for the network weights; the trained network is then ready to test a new set of data [5]. Training the weights of the multilayer perceptron can be considered an optimization problem in which the network weights are optimized.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
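The quoted passage frames MLP training as an optimization problem over the network weights. A minimal sketch of that view follows: the weights and biases are packed into one flat vector and the training error is minimized over it by a generic optimizer. The dataset, network shape, and the use of SciPy's derivative-free Nelder-Mead routine as a stand-in optimizer are all assumptions for illustration; the paper itself studies genetic algorithms for this role.

```python
# Sketch: "training = optimizing a flat weight vector". Dataset, sizes, and the
# choice of Nelder-Mead as the optimizer are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Tiny synthetic dataset with a simple nonlinear labeling rule.
X = rng.normal(size=(30, 2))
y = (X[:, :1] * X[:, 1:] > 0).astype(float)

N_IN, N_HID, N_OUT = 2, 3, 1
N_PARAMS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT

def training_error(w):
    """Objective: MSE of the MLP whose weights and biases are packed into the flat vector w."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    out = np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

w0 = rng.uniform(-1, 1, N_PARAMS)                      # initial free parameters
result = minimize(training_error, w0, method="Nelder-Mead",
                  options={"maxiter": 5000})
print("trained error:", result.fun)                    # optimized weight vector is result.x
```

Any optimizer that only needs objective evaluations (random search, differential evolution, or the genetic algorithm of this paper) can be dropped in for `minimize` here, which is precisely what makes the optimization framing useful when gradients are inconvenient.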
“…J. D. Schaffer et al. [17] wrote a survey on combinations of genetic algorithms and neural networks. The performance of the back-propagation algorithm compared with binary- and real-coded genetic algorithms for training multilayer perceptrons is reported in [2,3]. Ilonen et al. [1] compared back-propagation with differential evolution for neural network training.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%