International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.1990.115644

Supervised learning process of multi-layer perceptron neural networks using fast least squares

Cited by 14 publications (11 citation statements)
References 5 publications
“…For neural networks, however, the seriality property cannot be assumed for the input and output sequences in each layer. Thus it is inevitable to use an RLS-based scheme [3], [4] for the weight adaptation procedure. It can be shown that the updating equation (28) is equivalent to that of the standard RLS by expressing the terms in $C_j(n)h_N(n)$ and $C_N(n)$ using the RLS equations [13]; namely…”
Section: A. Time Updating and Weight Adaptation
confidence: 99%
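The quoted passage refers to an RLS-style recursion for adapting the weights of each node. As a minimal sketch of what one exponentially weighted RLS step looks like for a single node's weight vector, assuming standard adaptive-filtering notation (the names w, P, k and the forgetting factor lam are illustrative choices, not the cited paper's equation (28)):

    import numpy as np

    def rls_update(w, P, x, d, lam=0.99):
        """One exponentially weighted RLS step for a single node's weights.

        w   : (m,) current weight vector
        P   : (m, m) inverse input-correlation matrix estimate
        x   : (m,) input vector feeding the node
        d   : scalar desired (target) node response
        lam : forgetting factor; 0.99 is an assumed value
        """
        k = P @ x / (lam + x @ P @ x)           # gain vector
        e = d - w @ x                           # a priori error (old weights)
        w_new = w + k * e                       # weight update
        P_new = (P - np.outer(k, x @ P)) / lam  # Riccati update of P
        return w_new, P_new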
“…Equations (28), (31a), and (31b) represent the RLS algorithm developed in [3], [4], and $C_N(n)$ is the relevant gain matrix. Note that $\tilde{c}_j(n)$ is the a priori error (i.e., before updating) whereas $c_j(n)$ represents the a posteriori error (i.e., after updating).…”
Section: A. Time Updating and Weight Adaptation
confidence: 99%
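For context, the a priori/a posteriori distinction the citing authors draw is the standard one in exponentially weighted RLS; written in common adaptive-filtering notation (not the cited paper's own equation numbers or symbols), one recursion step for node $j$ reads:

\begin{aligned}
\tilde{c}_j(n) &= d_j(n) - w_j^{T}(n-1)\,x(n) && \text{(a priori error)} \\
k(n) &= \frac{P(n-1)\,x(n)}{\lambda + x^{T}(n)\,P(n-1)\,x(n)} && \text{(gain)} \\
w_j(n) &= w_j(n-1) + k(n)\,\tilde{c}_j(n) && \text{(weight update)} \\
P(n) &= \lambda^{-1}\!\left[P(n-1) - k(n)\,x^{T}(n)\,P(n-1)\right] && \text{(inverse-correlation update)} \\
c_j(n) &= d_j(n) - w_j^{T}(n)\,x(n) && \text{(a posteriori error)}
\end{aligned}

Substituting the weight update into the last line gives $c_j(n) = \tilde{c}_j(n)\,\lambda/\bigl(\lambda + x^{T}(n)P(n-1)x(n)\bigr)$: the a posteriori error is a scaled-down version of the a priori error, which is why the two must be kept distinct in the equivalence argument.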
“…$(\cdot)$ in all appropriate nodes, in principle any least-squares error algorithm can be used to update the weights. Algorithms based on similar ideas for updating weights one node at a time are given by Azimi-Sadjadi et al. [5] (henceforth, the A-S algorithm) and by Hunt and Deller [9]. The former is based on the conventional RLS algorithm [2], with the constraint that if any weight connected to a node is to be updated, then every weight connected to that node must be updated.…”
Section: Wv
confidence: 99%
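To make the "one node at a time" idea concrete: if desired node outputs are mapped back through the node's activation, each node's weights can be found by an ordinary linear least-squares solve. The following sketch assumes a tanh activation and a small ridge term for numerical stability; the function name, the arctanh linearization, and the regularization are illustrative assumptions, not the A-S algorithm as published.

    import numpy as np

    def node_least_squares(H, t, reg=1e-6):
        """Fit one node's weight vector by linear least squares.

        H   : (n_samples, m) matrix of inputs feeding the node
        t   : (n_samples,) desired post-activation node outputs
        reg : small ridge term for numerical stability (assumption)
        """
        # Linearize: map targets back through tanh so the remaining
        # problem in the weights is linear (z = H w in the ideal case).
        z = np.arctanh(np.clip(t, -0.999, 0.999))
        # Regularized normal equations: (H^T H + reg*I) w = H^T z
        A = H.T @ H + reg * np.eye(H.shape[1])
        return np.linalg.solve(A, H.T @ z)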