2011
DOI: 10.1007/978-3-642-20573-6_11

Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model

Abstract: The back propagation algorithm computes the weight changes of artificial neural networks; a common approach is a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are local minima and slow convergence speed. The addition of an extra term, called a proportional factor, reduces the convergence time of the back propagation algorithm. We have applied the three-term back propagation to multiplicative neural network learning…
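The abstract describes a three-term update rule: the usual gradient step scaled by a learning rate, a momentum term, and an added proportional term driven by the current output error. The sketch below illustrates that idea for a single multiplicative neuron; the function names, the logistic output, the squared-error cost, and the parameter values (eta, alpha, gamma) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def smn_forward(x, w, b):
    """Single multiplicative neuron: the net input is the product of the
    affine terms (w_i*x_i + b_i), squashed by a logistic function."""
    net = np.prod(w * x + b)
    return 1.0 / (1.0 + np.exp(-net)), net

def three_term_step(x, target, w, b, prev_dw, prev_db,
                    eta=0.1, alpha=0.8, gamma=0.05):
    """One weight update combining a learning-rate (gradient) term, a
    momentum term, and a proportional term driven by the output error.
    eta, alpha and gamma are illustrative values only."""
    y, net = smn_forward(x, w, b)
    e = target - y                      # output error, with E = 0.5 * e**2
    dE_dnet = -e * y * (1.0 - y)        # logistic derivative chained with dE/dy
    terms = w * x + b
    # derivative of the product net input w.r.t. each w_i and b_i
    dnet_dw = np.array([x[i] * np.prod(np.delete(terms, i)) for i in range(len(w))])
    dnet_db = np.array([np.prod(np.delete(terms, i)) for i in range(len(w))])
    dw = -eta * dE_dnet * dnet_dw + alpha * prev_dw + gamma * e
    db = -eta * dE_dnet * dnet_db + alpha * prev_db + gamma * e
    return w + dw, b + db, dw, db
```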

Cited by 26 publications (15 citation statements). References 12 publications.
“…This model is based on a polynomial structure. The simulation results show the effectiveness of the neuron model [2,14,13,17]. The SMN neuron model is based on the concept of arithmetic mean of the multiplicative inputs.…”
Section: Generalized Mean Single Multiplicative Neuron (GMSMN) Model
confidence: 92%
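For context, the quoted section contrasts the product-based SMN aggregation with a mean of the same affine terms. The snippet below is a minimal sketch of a generalized-mean aggregation under assumed conventions (affine terms w_i*x_i + b_i, logistic output, positive terms when the exponent is non-integer); it is not taken from the cited papers.

```python
import numpy as np

def generalized_mean(u, p):
    """Power mean of the terms u: p = 1 gives the arithmetic mean,
    and p -> 0 approaches the geometric mean (terms assumed positive
    for non-integer p)."""
    return np.mean(u ** p) ** (1.0 / p)

def gmsmn_output(x, w, b, p=1.0):
    """GMSMN-style neuron (illustrative): aggregate the affine terms
    (w_i*x_i + b_i) with a generalized mean instead of a plain product,
    then apply a logistic activation."""
    net = generalized_mean(w * x + b, p)
    return 1.0 / (1.0 + np.exp(-net))
```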
“…The SMN model is first proposed in [2] for solving various problems in [12][13][14][15][16][17]. This model is based on a polynomial structure.…”
Section: Generalized Mean Single Multiplicative Neuron (GMSMN) Model
confidence: 99%
“…Despite the general success of back-propagation in learning the network's parameter, several major deficiencies are still needed to be solved. First, the back-propagation algorithm will get trapped in local minima especially for nonlinearly separable problems, 18,19 such as the XOR problem. 20,21 Having trapped into local minima, back-propagation may lead to failure in finding a global optimal solution.…”
Section: Convergence and Stability Analysis
confidence: 99%
“…The main problem of machine learning is to minimize the cost function E with a suitable choice of weights W. A gradient method, described above and called backpropagation in the context of neural network training, can get stuck in local minima or take a very long time to run in order to optimize E. This is due to the fact that general properties of the cost surface are usually unknown and only trial-and-error numerical methods are available (see [4], [12], [16], [9], [17], [18], [5]). No theoretical approach is known to provide the exact initial weights in backpropagation with guaranteed convergence to the global minimum of E. One of the most powerful techniques used in backpropagation is adaptive learning rate selection [8], where the step size of iterations is gradually raised in order to escape a local minimum.…”
confidence: 99%
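The last quote mentions adaptive learning-rate selection, in which the step size is gradually raised to step out of a local minimum. Below is a minimal sketch of that general heuristic; the grow/shrink factors and the accept/reject rule are illustrative assumptions, not the exact scheme of the cited reference [8].

```python
import numpy as np

def adaptive_rate_descent(loss, grad, w0, eta0=0.01, grow=1.05,
                          shrink=0.5, tol=1e-8, max_iter=10000):
    """Gradient descent with a simple adaptive step size: grow the rate
    while the loss keeps decreasing, shrink it after an uphill step."""
    w = np.asarray(w0, dtype=float)
    eta = eta0
    prev = loss(w)
    for _ in range(max_iter):
        step = -eta * grad(w)
        candidate = w + step
        current = loss(candidate)
        if current <= prev:        # accept the step and speed up
            w, prev = candidate, current
            eta *= grow
        else:                      # reject the step and slow down
            eta *= shrink
        if np.linalg.norm(step) < tol:
            break
    return w
```

For example, with loss = lambda w: np.sum(w**2) and grad = lambda w: 2*w, the routine converges toward the origin while the step size adapts automatically; on a rougher cost surface the growing step size gives the iterate a chance to jump over shallow basins.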