Proceedings of 27th Asilomar Conference on Signals, Systems and Computers
DOI: 10.1109/acssc.1993.342540
Constructive proof of efficient pattern storage in the multi-layer perceptron

Abstract: In this paper, we


Cited by 10 publications (7 citation statements)
References 9 publications
“…In OWO-BP, linear equations are solved to find the output weights (OWO) and backpropagation (BP) is used to find the hidden weights [13]. In our approach we have adapted the idea [14] of minimizing a separate error function for each hidden unit to find the hidden weights, termed HWO.…”
Section: Neural Network with OWO and HWO (mentioning)
confidence: 99%
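
The OWO/HWO split quoted above can be made concrete: because the output units are linear in the hidden-unit activations, the output weights solve a linear least-squares problem exactly, while the hidden weights are still moved by a backpropagation-style step. The sketch below is a minimal illustration under those assumptions (single hidden layer, sigmoid hidden units, trailing bias columns); the function name owo_bp_step and the learning-rate handling are illustrative, not the exact formulation of [13] or [14].

```python
import numpy as np

def owo_bp_step(X, T, W_h, lr=0.1):
    """One iteration of an OWO-BP style scheme (illustrative sketch).

    X   : (N, n_in + 1) input patterns with a trailing bias column of ones
    T   : (N, n_out)    desired outputs
    W_h : (n_hid, n_in + 1) hidden-layer weights
    Returns updated (W_h, W_o) and the mean squared error.
    """
    # Hidden-layer activations (sigmoid units), augmented with a bias column.
    net_h = X @ W_h.T
    O = 1.0 / (1.0 + np.exp(-net_h))
    O_aug = np.hstack([O, np.ones((X.shape[0], 1))])

    # OWO: the output weights solve a linear least-squares problem exactly,
    # since the outputs are linear in the hidden activations.
    W_o, *_ = np.linalg.lstsq(O_aug, T, rcond=None)
    W_o = W_o.T                                   # (n_out, n_hid + 1)

    # BP: one gradient step on the hidden weights for the same error.
    Y = O_aug @ W_o.T
    E = T - Y
    delta_h = (E @ W_o[:, :-1]) * O * (1.0 - O)   # back-propagated error
    W_h = W_h + lr * (delta_h.T @ X) / X.shape[0]

    return W_h, W_o, np.mean(E ** 2)

# Toy usage: 4 inputs (+bias), 6 hidden units, 2 outputs.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(100, 4)), np.ones((100, 1))])
T = rng.normal(size=(100, 2))
W_h = rng.normal(scale=0.5, size=(6, 5))
for _ in range(50):
    W_h, W_o, mse = owo_bp_step(X, T, W_h)
```

Replacing the gradient step with a per-hidden-unit least-squares fit to the back-propagated targets gives the HWO variant mentioned in the quotation.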
“…Forecasting coal and gas outburst correctly is a complex, nonlinear mapping and identification problem. Various prediction techniques, such as experience- and knowledge-based methods and neural networks (NN) [3], [5], [9], [10], [13], have been proposed. Nonlinear system identification via neural networks consists in adjusting the network in such a way that it approximately describes the input-output mapping of the system.…”
Section: Introduction (mentioning)
confidence: 99%
“…(2) once trained, the MLP can be applied to data one or more orders of magnitude faster than the NNEs; (3) an MLP can closely approximate the performance of an NNE if it can process the NNE's cluster vectors without error (memorize the clusters and their associated output vectors); and (4) an MLP or Volterra filter can memorize as many patterns as it has free parameters per output node (its complexity) (Gopalakrishnan et al. 1994).…”
Section: Complexity Estimation (mentioning)
confidence: 99%
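
The memorization claim in point (4) is straightforward to make concrete: for a single-hidden-layer MLP, the number of patterns it can store is bounded by the total number of free parameters divided by the number of output nodes. The helper below is a hypothetical counting sketch under that architecture assumption (no bypass input-to-output connections); whether bypass weights are included depends on the network considered in the cited result.

```python
def patterns_storable(n_in, n_hid, n_out):
    """Free parameters per output node of a single-hidden-layer MLP,
    i.e. the pattern-storage bound stated in point (4) above.

    Counts hidden weights (with biases) plus output weights (with biases),
    then divides by the number of output nodes.
    """
    hidden_params = n_hid * (n_in + 1)    # weights + bias into each hidden unit
    output_params = n_out * (n_hid + 1)   # weights + bias into each output unit
    return (hidden_params + output_params) // n_out

# Example: 8 inputs, 10 hidden units, 2 outputs -> (10*9 + 2*11) / 2 = 56 patterns.
print(patterns_storable(8, 10, 2))
```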
“…Unfortunately, backpropagation is not a very effective method for updating hidden weights [15,29]. Some researchers [11,16,17,20,31] have used the Levenberg-Marquardt (LM) method to train the multilayer perceptron.…”
Section: Introduction (mentioning)
confidence: 99%
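
For reference, the Levenberg-Marquardt step that the cited works apply to MLP training has the damped Gauss-Newton form w <- w + (J^T J + mu*I)^{-1} J^T e. The sketch below assumes the caller supplies jacobian_fn (Jacobian of the network outputs with respect to a flattened weight vector) and residual_fn (targets minus outputs); both helpers are hypothetical placeholders, and the schedule for adapting mu is omitted.

```python
import numpy as np

def lm_update(w, jacobian_fn, residual_fn, mu=1e-3):
    """One Levenberg-Marquardt step on a flattened weight vector w.

    jacobian_fn(w) -> J, shape (N * n_out, n_weights): Jacobian of the
                      network outputs with respect to the weights
    residual_fn(w) -> e, shape (N * n_out,): targets minus outputs
    mu blends between a Gauss-Newton step (small mu) and a scaled
    gradient-descent step (large mu).
    """
    J = jacobian_fn(w)
    e = residual_fn(w)
    H = J.T @ J + mu * np.eye(J.shape[1])   # damped Gauss-Newton Hessian approximation
    return w + np.linalg.solve(H, J.T @ e)
```

In practice mu is increased when a step raises the error and decreased when it lowers it, which is what makes LM far more effective than plain backpropagation at adjusting the hidden weights.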