IJCNN-91-Seattle International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.1991.155275
The effect of initial weights on premature saturation in back-propagation learning

Abstract: The back-propagation (BP) algorithm is widely used for finding optimum weights of multi-layer neural networks in many pattern recognition applications. However, the critical drawback of the BP algorithm is its slow convergence of error. The major reason for this slow convergence is "premature saturation," a phenomenon in which the error of a neural network stays almost constant for some period of time during learning. It is known to be caused by an inappropriate set of initial weights. In this paper, the…
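The saturation mechanism the abstract describes can be illustrated with a minimal sketch (not from the paper; the layer sizes and weight scales below are illustrative assumptions): when initial weights are large, hidden-unit pre-activations land far out on the flat tails of the sigmoid, so the derivative term that back-propagation multiplies into every weight update is nearly zero and the error barely moves.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(100)]  # one input pattern

mean_grad = {}
for name, scale in (("small", 0.1), ("large", 5.0)):
    grads = []
    for _ in range(50):  # 50 hidden units, weights drawn at the given scale
        w = [random.gauss(0.0, scale) for _ in range(100)]
        z = sum(wi * xi for wi, xi in zip(w, x))  # pre-activation
        a = sigmoid(z)
        grads.append(a * (1.0 - a))  # sigmoid derivative; 0.25 at most
    mean_grad[name] = sum(grads) / len(grads)

print(mean_grad)  # large-scale init drives the mean derivative toward 0
```

With the small-scale initialization the average derivative stays a sizeable fraction of its 0.25 maximum, while the large-scale initialization pushes almost every unit into a saturated region where updates are negligible — the "error stays almost constant" regime described above.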

Cited by 75 publications (38 citation statements). References 4 publications.
“…Such networks are good tools for forecasting issues, however, they have several limitations such as low learning speed and falling into local minima [32][33][34]. As mentioned in literatures [20,23,[35][36][37][38][39], using efficient optimization algorithms (OAs), these limitations can be overcome.…”
Section: Introduction (mentioning)
confidence: 99%
“…Although ANN can solve complex engineering problems, it has a number of disadvantages; for example, slow learning rate and getting trapped in local minima (e.g., [43]). In order to dominate the ANN problems, utilizing several optimization algorithms (OA) such as genetic algorithm (GA) for adjusting the weight and bias of ANNs to enhance the performance capacity of them, is of advantage.…”
Section: Introduction (mentioning)
confidence: 99%
“…Effective weight initialization is associated to performance characteristics such as the time needed to successfully train the network and the generalization ability of the trained network. Inappropriate weight initialization is very likely to increase the training time or even to cause non convergence of the training algorithm, while another unfortunate result may be to decrease the network's ability to generalize well, especially when training with backpropagation (BP), a procedure suffering from local minima, (Haykin, 1999;Hassoun, 1995;Lee et al, 1991). These are defaults and limitations for having successful practical application of neural networks in real life processes.…”
Section: Introduction (mentioning)
confidence: 99%
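The last statement ties weight initialization to training time and generalization. A minimal sketch of one common remedy (a generic fan-in-scaled uniform initialization, not the specific scheme proposed in the cited paper) keeps initial pre-activations in the sigmoid's responsive region:

```python
import math
import random

def init_layer(fan_in, fan_out, rng=random):
    # Common heuristic: draw weights uniformly from
    # [-1/sqrt(fan_in), 1/sqrt(fan_in)] so that, for inputs of
    # unit scale, pre-activations stay near the sigmoid's
    # high-derivative region instead of its saturated tails.
    bound = 1.0 / math.sqrt(fan_in)
    return [[rng.uniform(-bound, bound) for _ in range(fan_in)]
            for _ in range(fan_out)]

random.seed(1)
W = init_layer(fan_in=100, fan_out=50)  # 50 hidden units, 100 inputs each
```

Because every weight magnitude is at most 1/sqrt(fan_in), the sum of fan_in such terms has roughly unit scale, which is exactly the regime where back-propagation's sigmoid derivatives remain usefully large at the start of training.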