[Proceedings] ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.1991.150846
Recursive node creation in back-propagation neural networks using orthogonal projection method

Cited by 5 publications (5 citation statements)
References 6 publications
“…The order update equation and the node update formulation can be obtained by appending a column vector to the column space of U(n) [3]. When a new node N is added (see Figure 2), assuming that the weights to this node are randomly initialized, the corresponding vector is [0, 0, …, z_N(n)]^T = z_N(n) [0, …, 0, 1]^T, where z_N(n) is the output of the added node.…”
Section: Order Updating and Recursive Node Creation (mentioning)
confidence: 99%
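A minimal NumPy sketch of the order-update idea in the passage above: the hidden-node outputs over n samples form the columns of a matrix, the output weights solve a least-squares problem, and creating a node appends the column [0, ..., 0, z_N(n)]^T. The batch np.linalg.lstsq calls stand in for the recursive orthogonal-projection update of the paper; the data, the dimensions, and the added node's output z_new are hypothetical.

import numpy as np

# Least-squares view of order updating (sketch, not the authors' recursion).
# Columns of Z hold hidden-node outputs over n samples; the output weights
# solve  min_w || d - Z w ||^2.
rng = np.random.default_rng(0)
n, N = 50, 4                      # samples, existing hidden nodes (hypothetical)
Z = rng.standard_normal((n, N))   # existing hidden-node outputs (synthetic)
d = rng.standard_normal(n)        # desired output sequence (synthetic)

w, *_ = np.linalg.lstsq(Z, d, rcond=None)
err_before = np.linalg.norm(d - Z @ w)

# Order update: at the moment a new node is created, only its current output
# z_N(n) is non-zero, so the appended column is [0, ..., 0, z_N(n)]^T.
z_new = 0.7                       # hypothetical output of the added node
new_col = np.zeros(n)
new_col[-1] = z_new
Z_aug = np.column_stack([Z, new_col])

w_aug, *_ = np.linalg.lstsq(Z_aug, d, rcond=None)
err_after = np.linalg.norm(d - Z_aug @ w_aug)

# Enlarging the column space can only shrink (or keep) the projection error.
assert err_after <= err_before + 1e-12
print(err_before, err_after)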
“…The approach in [3] is used in this paper to arrive at a step-by-step procedure for recursive dynamic node creation during the RLS-based learning process. Simulation results on target detection and classification from microwave data are presented, which indicate the effectiveness of the algorithm for real-world applications.…”
Section: Introduction (mentioning)
confidence: 99%
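One plausible reading of "recursive dynamic node creation during the RLS-based learning process" is a training loop that interleaves RLS weight updates with a node-creation test. The sketch below is only that reading: the stall test, the thresholds, and the net.rls_update()/net.add_hidden_node() interface are hypothetical placeholders, not the step-by-step procedure of [3].

def train_with_node_creation(net, data, max_nodes, tol, patience):
    """Interleave RLS-style weight updates with dynamic node creation (sketch)."""
    history = []
    for x, d in data:
        err = net.rls_update(x, d)          # recursive least-squares weight update
        history.append(err)
        # Create a node when the error has stopped improving and capacity remains.
        stalled = (len(history) > patience and
                   history[-patience] - history[-1] < tol)
        if stalled and net.num_hidden < max_nodes:
            net.add_hidden_node()           # order update: append a node/column
            history.clear()                 # restart the stall test
    return net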
“…Azimi-Sadjadi and Sheedvash (1991) and Sin and de Figueiredo (1992) have used the RLS algorithm in training multilayer perceptrons (MLP). Other related work is the use of the extended Kalman filter (EKF) algorithm, which is similar in form to RLS, but allows one to incorporate knowledge or estimates of noise variances in the data.…”
Section: Introduction (mentioning)
confidence: 99%
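For context on the remark above, the following is a textbook exponentially weighted RLS update for a single linear-in-the-parameters layer, shown only to make the "similar in form to RLS" comparison concrete. The forgetting factor, the delta initialization of P, and the per-layer linearization that full MLP training would need are assumptions, not details taken from the cited papers.

import numpy as np

class RLS:
    """Exponentially weighted recursive least squares for one linear layer (sketch)."""
    def __init__(self, dim, lam=0.99, delta=100.0):
        self.w = np.zeros(dim)           # weight vector
        self.P = delta * np.eye(dim)     # inverse correlation matrix estimate
        self.lam = lam                   # forgetting factor

    def update(self, x, d):
        e = d - self.w @ x               # a priori error
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)     # gain vector
        self.w = self.w + k * e          # weight update
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e

Unlike this plain RLS recursion, the EKF formulation mentioned in the quoted passage additionally carries noise covariances, which is where knowledge or estimates of noise variances in the data enter.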
“…The sum of the squared error is viewed as the squared length (or norm) of an error vector which is minimized using the geometric approach. It will be shown that the solution of the time updating leads to the RLS adaptation [9], [10], and the solution to the order updating allows us to recursively add nodes to the hidden layers during the training process.…”
Section: Training Process of Multilayer Neural Network (mentioning)
confidence: 99%
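Written out in generic notation (not necessarily that of the cited paper), the geometric view quoted above is:

\[
  J(n) \;=\; \sum_{k=1}^{n} e^{2}(k) \;=\; \lVert \mathbf{e}(n) \rVert^{2},
  \qquad
  \mathbf{e}(n) \;=\; \mathbf{d}(n) - \mathbf{Z}(n)\,\mathbf{w},
\]
\[
  \hat{\mathbf{w}} \;=\; \arg\min_{\mathbf{w}} \lVert \mathbf{d}(n) - \mathbf{Z}(n)\,\mathbf{w} \rVert^{2}
  \quad\Longleftrightarrow\quad
  \mathbf{Z}(n)^{T}\bigl(\mathbf{d}(n) - \mathbf{Z}(n)\,\hat{\mathbf{w}}\bigr) \;=\; \mathbf{0},
\]

i.e. the error vector is minimized by projecting d(n) orthogonally onto the column space of Z(n). Appending a row to Z(n) (a new training sample) is the time update solved recursively by RLS; appending a column (a new hidden node) is the order update that allows nodes to be added during training.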
“…Consider an M-layer network as shown in Fig. 1. Abstract: This paper presents the derivations of a novel approach for simultaneous recursive weight adaptation and node creation in multilayer back-propagation neural networks.…”
(mentioning)
confidence: 99%