1995
DOI: 10.1109/72.363480

Principal component extraction using recursive least squares learning

Abstract: A new neural network-based approach is introduced for recursive computation of the principal components of a stationary vector stochastic process. The neurons of a single-layer network are sequentially trained using a recursive least squares (RLS) type algorithm to extract the principal components of the input process. The optimality criterion is based on retaining the maximum information contained in the input sequence so as to be able to reconstruct the network inputs from the corresponding …
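Since the abstract is truncated, the paper's exact update equations are not reproduced here. The Python sketch below illustrates the general flavor of RLS-type sequential principal component extraction with deflation (a PASTd-style recursion); the function name `rls_pca`, the forgetting factor `beta`, and the initialization are illustrative assumptions, not the paper's published algorithm.

```python
import numpy as np

def rls_pca(samples, n_components, beta=0.995):
    """Sequential principal component extraction with RLS-style gains.

    A minimal PASTd-flavored sketch: each 'neuron' i tracks one
    component; its gain denominator d[i] accumulates output energy
    with forgetting factor beta, and the input is deflated before the
    next neuron sees it. Illustrative only, not the paper's exact
    algorithm.
    """
    dim = samples.shape[1]
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_components, dim)) * 0.1  # one weight vector per neuron
    d = np.ones(n_components)                           # RLS energy accumulators

    for x in samples:
        x = x.astype(float).copy()
        for i in range(n_components):
            y = W[i] @ x                         # neuron output
            d[i] = beta * d[i] + y * y           # update RLS gain denominator
            W[i] += (y / d[i]) * (x - y * W[i])  # least-squares weight update
            x -= y * W[i]                        # deflation: strip extracted component
    return W / np.linalg.norm(W, axis=1, keepdims=True)

# Example: the rows of W should align (up to sign) with the two
# dominant eigenvectors of the input covariance.
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 4)) * np.sqrt(np.array([5.0, 2.0, 0.5, 0.1]))
print(rls_pca(X, n_components=2).round(2))
```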

Cited by 116 publications (42 citation statements)
References 13 publications
“…For example, the LDA has been widely used for dimensionality reduction in speech recognition [6]. Miao and Hua [7] used an objective function and presented gradient descent and recursive least squares (RLS) algorithms [8] for adaptive principal subspace analysis (PSA). Xu [9] used a different objective function to derive an algorithm for adaptive PSA by applying the gradient descent optimization method.…”
Section: Introduction (mentioning)
confidence: 99%
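For readers unfamiliar with gradient-based adaptive PSA, the sketch below shows the classic Oja subspace rule, a representative member of the gradient-descent family referenced above; the step size `eta` and the initialization are illustrative assumptions rather than the update of any one cited paper.

```python
import numpy as np

def subspace_gradient(samples, n_components, eta=0.005):
    """Gradient-descent adaptive PSA sketch (Oja's subspace rule).

    Illustrative of the gradient-based PSA family cited above, not a
    reproduction of any single paper's update.
    """
    dim = samples.shape[1]
    rng = np.random.default_rng(0)
    W = np.linalg.qr(rng.standard_normal((dim, n_components)))[0]  # orthonormal start
    for x in samples:
        y = W.T @ x                                       # project onto current subspace
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))  # stochastic gradient step
    return W  # columns approximately span the principal subspace
```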
“…Diamantaras and Kung (1994) exploited the idea of using lateral connections with anti-Hebbian learning to recursively extract the principal components. In a different approach, based on recursive least squares (RLS) learning, Bannour and Azimi-Sadjadi (1995) proposed another structure for recursive extraction of principal components. Readers are referred to Diamantaras and Kung (1996) for a review of other related work in this area.…”
Section: Introduction (mentioning)
confidence: 99%
“…Several authors introduced modifications of the original Oja rule where the learning rate for the weight update is adjusted according to an eigenvalue estimate, e.g. Projection Approximation Subspace Tracking [2], learning rules based on recursive least squares approaches [3,4,5,6], and the Adaptive Learning Algorithm [7]. A number of these "coupled" learning rules, which simultaneously estimate eigenvectors and eigenvalues in a coupled system of equations, can be derived from a common framework by applying Newton's method to an information criterion [8].…”
Section: Introduction (mentioning)
confidence: 99%
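As a concrete illustration of the "coupled" idea, i.e. adapting the learning rate with a running eigenvalue estimate, here is a hedged single-component sketch built on Oja's rule; the particular coupling scheme (`eta = gamma / lam`) is a plausible instance, not the specific rule of any cited reference.

```python
import numpy as np

def oja_coupled(samples, gamma=0.01):
    """Oja's rule for the first principal component, with the step
    size scaled by a running eigenvalue estimate.

    A minimal sketch of the coupled eigenvector/eigenvalue idea
    described above; not a specific published learning rule.
    """
    rng = np.random.default_rng(0)
    w = rng.standard_normal(samples.shape[1])
    w /= np.linalg.norm(w)
    lam = 1.0  # running estimate of the leading eigenvalue

    for x in samples:
        y = w @ x
        lam += gamma * (y * y - lam)   # eigenvalue estimate from output power
        eta = gamma / lam              # step size shrinks as the eigenvalue grows
        w += eta * y * (x - y * w)     # Oja update keeps ||w|| near 1
    return w, lam
```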