Sixth International Conference on Intelligent Systems Design and Applications 2006
DOI: 10.1109/isda.2006.95
An Improved Learning Algorithm Based on The Broyden-Fletcher-Goldfarb-Shanno (BFGS) Method For Back Propagation Neural Networks

Cited by 51 publications (38 citation statements)
References 16 publications
“…This algorithm depends on several parameters, such as the number of hidden nodes in the hidden layers, the learning rate, the momentum rate, the activation function, and the number of training epochs. Furthermore, these parameters can change the learning performance from poor to good accuracy [23].…”
Section: Neural Network
confidence: 99%
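The hyperparameters named in the statement above can be gathered into a single configuration; the values below are illustrative placeholders, not those of the cited work:

```python
# Hypothetical hyperparameter set for a back-propagation network;
# every value here is a placeholder, not taken from the cited paper.
hyperparams = {
    "hidden_nodes": [10],       # number of hidden nodes per hidden layer
    "learning_rate": 0.1,       # learning rate
    "momentum": 0.9,            # momentum rate
    "activation": "sigmoid",    # activation function
    "epochs": 500,              # number of training passes
}
```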
“…First, a novel approach for improving the training efficiency of the gradient descent method (the BP algorithm), presented by Nawi et al. [12], is discussed. Their method modifies the initial search direction by changing the gain value adaptively for each node.…”
Section: Methods
confidence: 99%
“…Algorithm [12]: Initialize the weight vector with random values and the vector of gain values with ones. Repeat the following steps 1, 2, and 3 on an epoch-by-epoch basis until the given error-minimization criteria are satisfied.…”
Section: Methods
confidence: 99%
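The initialization and epoch loop quoted above can be sketched for a single sigmoid unit. This is a simplified illustration only: the gain enters the activation as sigmoid(gain · net), and the per-step weight and gain updates below are assumptions standing in for the exact rules of the cited algorithm.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
n_inputs = 2
# Weights (bias first) initialized with random values; gain initialized to one,
# as the quoted algorithm prescribes.
weights = [random.uniform(-0.5, 0.5) for _ in range(n_inputs + 1)]
gain = 1.0

# Toy linearly separable task (logical AND) as training data.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
lr, lr_gain = 0.5, 0.1

for epoch in range(5000):                  # epoch-by-epoch loop
    sse = 0.0
    for xi, ti in zip(X, y):
        net = weights[0] + sum(w * v for w, v in zip(weights[1:], xi))
        out = sigmoid(gain * net)
        err = ti - out
        sse += err * err
        # Gradient of sigmoid(gain * net) w.r.t. weights carries a factor of gain.
        delta = err * out * (1.0 - out)
        weights[0] += lr * delta * gain
        for j in range(n_inputs):
            weights[j + 1] += lr * delta * gain * xi[j]
        gain += lr_gain * delta * net      # simplified adaptive gain update
    if sse < 0.01:                         # error-minimization criterion
        break
```

After training, rounding the unit's output reproduces the AND targets; the adaptive gain sharpens the sigmoid as training proceeds.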
“…The error back-propagation training was provided by the Broyden-Fletcher-Goldfarb-Shanno (BFGS) training algorithm, which is well suited to unconstrained nonlinear multi-dimensional problems [25]. This algorithm provided fast and efficient back-propagation neural network training by adaptively modifying the initial search direction to improve training efficiency [26].…”
Section: Artificial Neural Network Modeling
confidence: 99%
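BFGS maintains an approximation to the inverse Hessian, rebuilt each step from parameter and gradient differences, which shapes the search direction the way the quoted statement describes. A minimal sketch of the update on a two-dimensional quadratic (a stand-in for a network's loss surface, not the cited implementation):

```python
import numpy as np

# Quadratic objective f(w) = 0.5 w^T A w - b^T w, with gradient A w - b;
# its minimizer is the solution of A w = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(w):
    return A @ w - b

w = np.zeros(2)
H = np.eye(2)                          # inverse-Hessian approximation
g = grad(w)
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:      # convergence criterion
        break
    p = -H @ g                         # BFGS search direction
    # Exact line search, valid for a quadratic objective.
    alpha = -(g @ p) / (p @ A @ p)
    w_new = w + alpha * p
    g_new = grad(w_new)
    s, yv = w_new - w, g_new - g
    rho = 1.0 / (yv @ s)
    I = np.eye(2)
    # Standard BFGS inverse-Hessian update.
    H = (I - rho * np.outer(s, yv)) @ H @ (I - rho * np.outer(yv, s)) + rho * np.outer(s, s)
    w, g = w_new, g_new
```

On a quadratic with exact line search, BFGS converges in at most as many steps as there are dimensions, which is why quasi-Newton training is markedly faster than plain gradient descent near a minimum.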