1998
DOI: 10.1016/s0165-1684(98)00059-0

Performances analysis of a Givens parametrized adaptive eigenspace algorithm

Abstract: In this paper, we address an adaptive estimation method for eigenspaces of covariance matrices. We are interested in a gradient procedure based on coupled maximizations or minimizations of Rayleigh quotients where the constraints are replaced by a Givens parametrization. This enables us to provide a canonic orthonormal eigenbasis estimator. We study the convergence of this algorithm with the help of the associated ordinary differential equation (ODE), and propose a performance evaluation by computing the varia…
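The abstract describes eigenvector estimation by gradient steps on Rayleigh quotients, with the orthonormality constraints absorbed into a Givens-angle parametrization. As a rough illustration only (not the paper's algorithm), the Python sketch below tracks the dominant eigenvector of a 2x2 covariance matrix with a single Givens angle; the function and variable names are hypothetical and a constant step size is used for simplicity.

```python
import numpy as np

def track_dominant_eigenvector(samples, mu=0.05):
    """Illustrative 2-D sketch: stochastic gradient ascent on the Rayleigh
    quotient v(theta)^T x x^T v(theta), with the unit-norm constraint handled
    by the Givens-angle parametrization v(theta) = [cos theta, sin theta]."""
    theta = 0.0
    for x in samples:
        v = np.array([np.cos(theta), np.sin(theta)])    # current unit-norm estimate
        dv = np.array([-np.sin(theta), np.cos(theta)])  # d v / d theta
        grad = 2.0 * (dv @ x) * (v @ x)                 # instantaneous gradient w.r.t. theta
        theta += mu * grad                              # ascent: dominant eigenvector
    return np.array([np.cos(theta), np.sin(theta)])
```

Here `samples` would be an iterable of length-2 data vectors x(k); the coupled, deflation-based updates that the full algorithm uses to estimate several eigenvectors at once are not shown.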

Cited by 4 publications (5 citation statements)
References 24 publications
“…, r. This rather intuitive computational process was confirmed by simulation results [60]. Later, a formal analysis of the convergence and performance was carried out in [23], where it was proved that the stationary points of the associated ODE are globally asymptotically stable (see Subsection VII-A) and that the stochastic algorithm (VI.1) converges almost surely to these points for stationary data x(k) when µ_k is decreasing with lim_{k→∞} µ_k = 0 and Σ_k µ_k = ∞. We note that this algorithm yields exactly orthonormal estimates of the r dominant or minor eigenvectors through a simple change of sign in its step size, and requires O(nr) operations per iteration, not counting the trigonometric function evaluations.…”
Section: A. Rayleigh Quotient-Based Methods
confidence: 62%
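As an illustration of the step-size conditions quoted above (not a prescription from [23]), a harmonic schedule satisfies both requirements; names below are hypothetical.

```python
def mu_k(k, mu0=0.05):
    """Harmonic schedule: mu_k decreases to 0 while sum_k mu_k diverges."""
    return mu0 / (k + 1)

# In a sketch like the one given after the abstract, using
# "theta -= mu_k(k) * grad" instead of "theta += mu_k(k) * grad" would descend
# rather than ascend the Rayleigh quotient, i.e. track a minor instead of a
# dominant eigenvector, echoing the sign change in the step size noted above.
```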
“…T. This property has been exploited in [23], [26] to reduce the computational cost of the previously introduced adaptive eigenvector algorithms. Furthermore, the conditioning of these two independent EVDs is improved with respect to the EVD of C_x, since the difference between two consecutive eigenvalues generally increases.…”
Section: E. Particular Case of Second-Order Stationary Data
confidence: 99%
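The splitting behind this remark can be made concrete: for a symmetric centro-symmetric matrix of even size n = 2m, an orthogonal change of basis built from the exchange matrix block-diagonalizes it into two half-sized symmetric matrices, whose eigenvectors lift back to an orthonormal eigenbasis of the full matrix. The sketch below illustrates this classical splitting; it is not code from [23] or [26], the names are hypothetical, and only the even-size case is handled.

```python
import numpy as np

def centro_symmetric_evd(C):
    """Illustrative sketch: EVD of a symmetric centro-symmetric matrix of
    even size n = 2m via two half-sized symmetric eigenproblems."""
    n = C.shape[0]
    m = n // 2
    Jm = np.fliplr(np.eye(m))                     # m x m exchange (flip) matrix
    C11, C21 = C[:m, :m], C[m:, :m]
    A_plus = C11 + Jm @ C21                       # half-sized symmetric matrix
    A_minus = C11 - Jm @ C21                      # half-sized symmetric matrix
    d1, U1 = np.linalg.eigh(A_plus)
    d2, U2 = np.linalg.eigh(A_minus)
    V1 = np.vstack([U1, Jm @ U1]) / np.sqrt(2)    # symmetric eigenvectors of C
    V2 = np.vstack([U2, -Jm @ U2]) / np.sqrt(2)   # skew-symmetric eigenvectors of C
    return np.concatenate([d1, d2]), np.hstack([V1, V2])
```

A quick check: forming C = (B + J B J)/2 from a random symmetric B, with J = np.fliplr(np.eye(n)), gives a symmetric centro-symmetric test matrix whose numpy.linalg.eigh decomposition matches this output up to ordering and sign.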
“…In order to remove the constraint V^T V = I in the optimisation problem (44), we can represent an orthogonal V by its independent parameters based on the Givens rotation parameterisation [19]. Specifically, when m = 2, any orthogonal V can be written as…”
Section: Optimal Realisation With the Smallest Dynamic Range
confidence: 99%
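For intuition about the truncated expression above: when m = 2 a rotation is determined by a single angle, and more generally an m x m rotation matrix can be generated as a product of m(m-1)/2 plane (Givens) rotations, with any remaining orthogonal matrix reachable by flipping the sign of one column. The sketch below is a generic illustration of that parameterisation, not the specific form used in [19]; the function name and angle ordering are hypothetical.

```python
import numpy as np

def givens_orthogonal(thetas, m):
    """Illustrative sketch: build an m x m rotation matrix as a product of
    m(m-1)/2 plane (Givens) rotations, one angle per coordinate pair (i, j)."""
    assert len(thetas) == m * (m - 1) // 2
    V = np.eye(m)
    k = 0
    for i in range(m - 1):
        for j in range(i + 1, m):
            G = np.eye(m)
            c, s = np.cos(thetas[k]), np.sin(thetas[k])
            G[i, i], G[j, j] = c, c
            G[i, j], G[j, i] = -s, s
            V = V @ G
            k += 1
    return V

# With m = 2 and a single angle t this reduces to
# [[cos t, -sin t], [sin t, cos t]].
```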
“…In the real case, we use the property that an orthonormal eigenbasis of a symmetric centro-symmetric matrix can be obtained from orthonormal eigenbases of two half-sized symmetric real matrices [5]. This property has already been used in [8,9] and in [19] for, respectively, a parameterized adaptive eigenspace algorithm and an adaptive eigenfilter bank. But no asymptotic performance analysis has been done yet.…”
Section: Introduction
confidence: 99%