2003
DOI: 10.1006/mssp.2002.1510
Optimal Autoregressive Modelling of a Measured Noisy Deterministic Signal Using Singular-Value Decomposition

Cited by 26 publications (13 citation statements)
References 10 publications
“…For a discrete signal X = [x(1), x(2), …, x(N)], a Hankel matrix can be created as follows:

    A = [ x(1)   x(2)    …  x(n)
          x(2)   x(3)    …  x(n+1)
           ⋮      ⋮           ⋮
          x(m)   x(m+1)  …  x(N)  ]

where 1 < n < N; letting m = N − n + 1, then A ∈ R^(m×n). This matrix is also called the trajectory matrix in some references [3,15]. The traditional noise-reduction procedure of SVD can be summarized as follows [3,9,10,15]: first, matrix A is decomposed by SVD, obtaining the matrices U, D and V. Second, in the diagonal matrix D the significant singular values are retained while the rest are set to zero, yielding a new diagonal matrix D′.…”
Section: Signal Decomposition Principle of Hankel Matrix-Based SVD
mentioning, confidence: 99%
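The procedure quoted above (build a Hankel matrix, keep only the significant singular values, reconstruct) can be sketched in NumPy. The function name and the anti-diagonal averaging step used to map the filtered matrix back to a signal are illustrative choices, not taken from the cited papers:

```python
import numpy as np

def hankel_svd_denoise(x, n, rank):
    """Classical SVD noise reduction on a Hankel (trajectory) matrix.

    x    : 1-D signal of length N
    n    : number of columns of the Hankel matrix (1 < n < N)
    rank : number of significant singular values to retain
    """
    N = len(x)
    m = N - n + 1                       # A is m x n
    # Build the Hankel matrix: row i holds x[i], x[i+1], ..., x[i+n-1]
    A = np.array([x[i:i + n] for i in range(m)])
    # Decompose A = U D V^T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Keep the `rank` largest singular values, set the rest to zero (D')
    s[rank:] = 0.0
    A0 = (U * s) @ Vt
    # A0[i, j] estimates x[i + j]; average each anti-diagonal to get
    # the denoised signal (diagonal k of the column-flipped matrix
    # collects all entries with i + j = n - 1 - k)
    x_hat = np.array([np.mean(np.diag(A0[:, ::-1], k))
                      for k in range(n - 1, -m, -1)])
    return x_hat
```

A pure sinusoid gives a rank-2 trajectory matrix, so truncating at rank 2 reconstructs it almost exactly; for a noisy signal the same truncation suppresses the noise subspace.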
“…where U and V are orthogonal matrices, U ∈ R^(m×m), V ∈ R^(n×n); D is the diagonal matrix, D = [diag(σ1, σ2, …, σq), O] or its transpose (decided by whether m < n or m ≥ n), D ∈ R^(m×n), O is the zero matrix, q = min(m, n), and σ1 ≥ σ2 ≥ … ≥ σq > 0. The σi (i = 1, 2, …, q) are called the singular values of matrix A. The SVD method has been widely applied in many fields in recent years, such as data compression [1,2], system identification [3], adaptive filtering [4,5], principal component analysis (PCA) [6,7], noise reduction [8–10], faint-signal extraction [11,12], machine condition monitoring [13] and so on. For example, Ahmed et al. use SVD to compress the electrocardiogram (ECG) signal: their main idea is to transform the ECG signal into a rectangular matrix, compute its SVD, then discard the components represented by the small singular values and retain only those represented by the large singular values, so that the ECG signal can be greatly compressed [1].…”
Section: Introduction
mentioning, confidence: 99%
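The shapes, ordering, and rank-truncation idea described above can be checked directly in NumPy; this is a sketch on random data with an illustrative truncation rank, not the ECG pipeline of [1]:

```python
import numpy as np

# Shape check for A = U D V^T with A in R^{m x n}, m != n
rng = np.random.default_rng(0)
m, n = 6, 4
A = rng.standard_normal((m, n))
U, s, Vt = np.linalg.svd(A)          # full SVD: U is m x m, Vt is n x n
assert U.shape == (m, m) and Vt.shape == (n, n)
q = min(m, n)
# Singular values come back sorted: s1 >= s2 >= ... >= sq >= 0
assert len(s) == q and np.all(np.diff(s) <= 0)

# Rank-k truncation (the compression idea): keep the k largest values
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# Eckart-Young: the spectral-norm error equals the first discarded value
err = np.linalg.norm(A - A_k, ord=2)
assert np.isclose(err, s[k])
```

Storing only the k leading columns of U and V plus k singular values needs k(m + n + 1) numbers instead of mn, which is where the compression comes from.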
“…There are several criteria to determine this order. For example, Shin et al. (2003) proposed a criterion based on the singular-value decomposition. The two most widely used methods are Akaike's information theoretic criterion (AIC) and Akaike's final prediction error (FPE) (Aguirre, 2004).…”
Section: Damage-Sensitive Index Extraction
mentioning, confidence: 99%
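As a sketch of how AIC and FPE trade residual variance against model order, the following least-squares AR fit and the two criteria are illustrative implementations (common textbook forms, not the SVD-based criterion of Shin et al.):

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns the residual variance."""
    N = len(x)
    # Regressor matrix: column i holds lag i+1 of the signal
    X = np.column_stack([x[p - i - 1:N - i - 1] for i in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ a
    return np.mean(r ** 2)

def aic(x, p):
    # AIC: N log(residual variance) + 2p, minimized over the order p
    return len(x) * np.log(fit_ar(x, p)) + 2 * p

def fpe(x, p):
    # FPE: residual variance scaled by (N + p + 1) / (N - p - 1)
    N = len(x)
    return fit_ar(x, p) * (N + p + 1) / (N - p - 1)
```

For data generated by a true AR(2) process, both criteria fall sharply going from order 1 to order 2 and then flatten, which is how the "knee" identifies the order.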
“…For example, Singular Value Decomposition (SVD) methods are proposed by Shin et al. [25,33,34]. Linear Discriminant Analysis (LDA) is adopted to represent face-discriminant features by Fabregas et al. [2,4,17,18,23,24,26,27,35,39,45].…”
mentioning, confidence: 99%