2013
DOI: 10.5120/14641-2943
A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training

Abstract: In this study, a hybrid differential evolution-back-propagation algorithm is proposed to optimize the weights of a feedforward neural network. The hybrid algorithm achieves faster convergence with higher accuracy. The proposed hybrid algorithm, which combines the differential evolution (DE) and back-propagation (BP) algorithms and is referred to as the DE-BP algorithm, trains the weights of the feedforward neural network (FNN) by exploiting the global searching ability of the DE evolutionary algorithm and the strong local searching ability of the BP algorithm.
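The abstract pairs DE's global exploration with BP's local refinement. Below is a minimal runnable sketch of that general DE-BP pattern, assuming a tiny 2-4-1 tanh network on XOR; the DE/rand/1/bin scheme, the numerical-gradient stand-in for analytic BP, and all hyperparameters are illustrative assumptions, not the authors' published settings.

```python
import numpy as np

# Hedged sketch of the general DE-BP idea: DE evolves flattened weight vectors
# globally, then gradient descent (the "BP" phase) refines the best one.
# All sizes and hyperparameters below are assumptions for illustration.
rng = np.random.default_rng(0)
N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1   # W1 + b1 + W2 + b2 = 17

def unpack(w):
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = w[N_IN * N_HID:N_IN * N_HID + N_HID]
    W2 = w[-N_HID - 1:-1].reshape(N_HID, 1)
    b2 = w[-1:]
    return W1, b1, W2, b2

def mse(w, X, y):
    W1, b1, W2, b2 = unpack(w)
    out = np.tanh(X @ W1 + b1) @ W2 + b2   # 2-4-1 forward pass
    return np.mean((out - y) ** 2)

# Toy task: XOR.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

NP, F, CR = 20, 0.8, 0.9
pop = rng.normal(0.0, 1.0, (NP, DIM))

# Phase 1: DE/rand/1/bin global search over weight vectors.
for gen in range(200):
    for i in range(NP):
        a, b, c = pop[rng.choice([j for j in range(NP) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True    # force at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        if mse(trial, X, y) < mse(pop[i], X, y):
            pop[i] = trial

best = min(pop, key=lambda w: mse(w, X, y)).copy()

# Phase 2: "BP" local refinement, approximated here with a central-difference
# gradient for brevity; a real implementation would backpropagate analytically.
lr, eps = 0.1, 1e-6
for step in range(500):
    grad = np.array([(mse(best + eps * e, X, y) - mse(best - eps * e, X, y)) / (2 * eps)
                     for e in np.eye(DIM)])
    best -= lr * grad

print("final MSE:", mse(best, X, y))
```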

Cited by 27 publications (17 citation statements). References 17 publications.
“…This type of reasoning is not suitable in a logical context but can be observed in human behavior (Ding et al. 2014). The proposed back-propagation algorithm is used to train the weights of the feed-forward neural network (FNN), as the back-propagation algorithm has strong local searching ability (Sarangi et al. 2013). The Sect.…”
Section: Neural Network Model (mentioning), confidence: 99%
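The excerpt above leans on BP's strong local searching ability. For concreteness, here is a hedged sketch of one analytic back-propagation step for a small 2-4-1 tanh network under MSE loss; the shapes, learning rate, and function name are assumptions for illustration, not code from the cited papers.

```python
import numpy as np

def bp_step(W1, b1, W2, b2, X, y, lr=0.1):
    """One gradient-descent step on MSE for a 2-4-1 tanh network (illustrative)."""
    h = np.tanh(X @ W1 + b1)               # hidden activations, shape (n, 4)
    out = h @ W2 + b2                      # linear output, shape (n, 1)
    d_out = 2.0 * (out - y) / len(X)       # dMSE/d(out)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return W1, b1, W2, b2
```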
“…Many researchers have recognized that combining global search methods such as GA, PSO, and DE with local search methods like BP, Levenberg-Marquardt (LM), and so on is an effective approach for building approximation models of ANNs. Some studies hybridize the DE with gradient-based methods. Following this trend, this paper presents a new way of integrating the BP into the MDE, in which the BP is applied after the selection phase of the MDE's search process and uses only the best individual as the initial point for its search.…”
Section: The Proposed Evolution Neural NARX Model for Nonlinear System (mentioning), confidence: 99%
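The statement above pins down the integration point precisely: BP runs after DE's selection phase and starts only from the current best individual. A minimal sketch of that control flow follows, where `de_generation`, `bp_refine`, and `fitness` are hypothetical caller-supplied callbacks, not the paper's exact MDE operators.

```python
def mde_bp(pop, fitness, de_generation, bp_refine, generations=100):
    """Hybrid loop sketch: DE generation, then BP from the best individual only."""
    for _ in range(generations):
        pop = de_generation(pop)               # mutation, crossover, and selection
        i_best = min(range(len(pop)), key=lambda i: fitness(pop[i]))
        pop[i_best] = bp_refine(pop[i_best])   # local refinement of the elite only
    return min(pop, key=fitness)
```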
“…Curteanu et al. proposed a new evolutionary neural method that applied a variant of DE using an opposition-based learning initialization, a simple self-adaptive procedure for the control parameters, and a modified mutation principle based on the fitness function as a criterion for reorganization. Other authors proposed combining the global search of the DE algorithm with local search methods such as BP and Levenberg-Marquardt (LM) to build an effective approximation approach for neural network models.…”
Section: Introduction (mentioning), confidence: 99%
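Opposition-based learning initialization, mentioned above, is compact enough to sketch: for each random candidate, also score its "opposite" point within the search bounds and keep the fitter half of the combined pool. The sketch below shows the generic OBL scheme, not Curteanu et al.'s exact variant; the bounds `lo`/`hi` and the `fitness` callback are assumptions.

```python
import numpy as np

def obl_init(fitness, NP, lo, hi, rng=None):
    """Opposition-based init (generic sketch): keep the best NP of x and lo+hi-x."""
    rng = rng or np.random.default_rng()
    X = rng.uniform(lo, hi, (NP, len(lo)))        # random candidates in [lo, hi]
    candidates = np.vstack([X, lo + hi - X])      # each point plus its opposite
    scores = np.array([fitness(x) for x in candidates])
    return candidates[np.argsort(scores)[:NP]]    # fitter half survives
```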
“…This approach, on the other hand, easily falls into local optima and suffers from premature convergence [38,47]. In recent years, because of their promising self-organization and global optimization abilities, swarm intelligence algorithms such as genetic algorithms (GAs) [40,48], particle swarm optimization (PSO) algorithms [46,49,50], and differential evolution (DE) [51] have been successfully used to optimize the parameters of MLPs.…”
Section: Introduction (mentioning), confidence: 99%
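Of the population-based trainers listed above, PSO is representative and easy to sketch over a flattened MLP weight vector. The following is generic PSO with standard inertia and acceleration terms; every name and parameter here is an illustrative assumption, not taken from the cited works.

```python
import numpy as np

def pso(fitness, dim, NP=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic PSO sketch: minimize fitness over a dim-length weight vector."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, (NP, dim))           # particle positions
    v = np.zeros((NP, dim))                       # particle velocities
    pbest, pcost = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((NP, dim)), rng.random((NP, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        cost = np.array([fitness(p) for p in x])
        improved = cost < pcost                   # update personal bests
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        gbest = pbest[pcost.argmin()].copy()      # update global best
    return gbest
```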