2018
DOI: 10.1109/jproc.2018.2844126
Static and Dynamic Robust PCA and Matrix Completion: A Review

Abstract: Principal Components Analysis (PCA) is one of the most widely used dimension reduction techniques. Robust PCA (RPCA) refers to the problem of PCA when the data may be corrupted by outliers. Recent work by Candès, Wright, Li, and Ma defined RPCA as a problem of decomposing a given data matrix into the sum of a low-rank matrix (true data) and a sparse matrix (outliers). The column space of the low-rank matrix then gives the PCA solution. This simple definition has led to a large amount of interesting new work o…
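To make the decomposition described in the abstract concrete, below is a minimal numpy sketch of principal component pursuit (the convex formulation min ||L||_* + λ||S||_1 subject to L + S = M) solved with a standard ADMM scheme. The function name rpca_pcp and the defaults for λ and the step size μ are illustrative choices following common heuristics from the RPCA literature, not the review's own implementation.

```python
import numpy as np

def soft_threshold(X, tau):
    """Entrywise soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

def rpca_pcp(M, lam=None, mu=None, max_iter=500, tol=1e-7):
    """Split M into a low-rank part L (true data) and a sparse part S (outliers)
    by ADMM on the principal component pursuit objective."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))        # common default weight on the sparse term
    if mu is None:
        mu = m * n / (4.0 * np.abs(M).sum())  # heuristic ADMM penalty parameter
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)                 # low-rank update
        S = soft_threshold(M - L + Y / mu, lam / mu)      # sparse / outlier update
        residual = M - L - S
        Y = Y + mu * residual                              # dual variable update
        if np.linalg.norm(residual) <= tol * np.linalg.norm(M):
            break
    return L, S

# Usage sketch on synthetic data: a rank-2 matrix corrupted by sparse outliers.
rng = np.random.default_rng(0)
L_true = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))
S_true = (rng.random((100, 50)) < 0.05) * rng.standard_normal((100, 50)) * 10
L_hat, S_hat = rpca_pcp(L_true + S_true)
```

The column space of the recovered L_hat then plays the role of the PCA solution, as the abstract notes.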

Cited by 64 publications (29 citation statements) · References 75 publications
“…Computing run times of simple AI diagnostic modules (i.e., auto-encoders and support vector machines), advanced convolutional or recurrent neural networks, and other DL algorithms depend greatly on the dimensionality of the inputted data. To achieve computing efficiencies, data scientists use a wide variety of dimensionality reduction and feature selection techniques: principal component analysis (PCA), 2 generalized spike models, 22 robust PCA, 23,24 PCA whitening, 25 robust subspace tracking, 24 low rank plus sparse [L + S] data decomposition 26 and algorithms (i.e., t -distributed stochastic neighbor embedding). 27 With proper preprocessing of dynamic datasets, AI technologies are becoming more efficient at signal processing (i.e., satellite communications and seismology), computer vision (i.e., video surveillance and traffic patterns), and network traffic analysis.…”
Section: Introduction (mentioning; confidence: 99%)
“…The search for robust methods of PCA is a very active area of research in statistics and machine learning. 19 This search is sometimes divided into robust PCA, where individual matrix elements of M are outliers, 20 and robust subspace recovery, where entire columns may be outliers. 21 We have considered L1-norm PCA, 22,23,24,25 in which principal components minimize not the variance of the data residuals (the L2 norm) but the absolute deviation of the residuals (the L1 norm).…”
Section: Robust Methods of PCA (mentioning; confidence: 99%)
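The distinction drawn in the quoted passage is between the objective classical PCA minimizes (sum of squared residuals, the L2 norm) and the one L1-norm PCA minimizes (sum of absolute residuals). The small numpy sketch below only evaluates both objectives for the ordinary leading PCA direction of a centered data matrix; it is not an L1-PCA solver, and the function names are illustrative.

```python
import numpy as np

def residuals(X, v):
    """Residuals of the data X after projecting onto the unit direction v."""
    v = v / np.linalg.norm(v)
    return X - np.outer(X @ v, v)

def pca_objectives(X):
    """Compare the L2 and L1 reconstruction objectives for the leading
    ordinary-PCA direction of X (samples in rows)."""
    Xc = X - X.mean(axis=0)                     # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0]                                   # leading PCA direction
    R = residuals(Xc, v)
    l2 = np.sum(R ** 2)                         # what classical PCA minimizes
    l1 = np.sum(np.abs(R))                      # what L1-norm PCA minimizes instead
    return l2, l1
```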
“…Ali et al [16] review the major tensor decomposition methods with a focus on problems targeted by classical PCA. Namrata et al [17] studied the dynamic (time-varying) version of the RPCA problem and proposed a series of provably correct, fast, and memoryefficient tracking solutions. The above studies have improved the normal vector estimation of the point cloud to some extent.…”
Section: PCA-Based Normal Vector Estimation and Outlier Correction (mentioning; confidence: 99%)