2020
DOI: 10.1016/j.amc.2019.124783

Low-rank tensor train for tensor robust principal component analysis

Cited by 84 publications (42 citation statements)
References 30 publications
“…The most significant application of principal component analysis is the representation of a multivariate data table as a small set of variables, with the aim of revealing trends, bounds, groups, and outliers [51]. Principal component analysis is a very flexible tool and allows the analysis of datasets that may include, for example, multicollinearity, missing values, and categorical records, in addition to incorrect data [52]. The objective is to detect and express significant data records as a set of principal indices or principal components [51,53].…”
Section: Urban Change Detection
confidence: 99%
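To make the role of principal component analysis described in the statement above concrete, here is a minimal sketch, not taken from the cited paper: the synthetic data table, the planted outlier, and the choice of two components are illustrative assumptions. It summarizes a multivariate table into a few component scores so that trends and outliers can be inspected in a low-dimensional view.

```python
# Minimal PCA sketch (illustrative assumptions: synthetic data, 2 components).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic multivariate "record table": 200 samples, 10 correlated variables.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.1 * rng.normal(size=(200, 10))
X[0] += 5.0  # plant one outlier

pca = PCA(n_components=2)
scores = pca.fit_transform(X)          # principal component scores per sample
print(pca.explained_variance_ratio_)   # variance captured by each component

# A sample whose score lies far from the bulk is a candidate outlier.
dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
print("most atypical sample:", dist.argmax())
```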
“…However, the spatial resolution is inevitably decreased because of the limited sun irradiance [3]. Thus, techniques are needed to enhance the spatial quality of HSIs [4]-[11]. MSIs, in contrast, are acquired with abundant spatial resolution but poor spectral resolution compared with HSIs.…”
Section: Introduction
confidence: 99%
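As a rough illustration of the spatial-enhancement idea mentioned in the statement above, the following sketch upsamples a synthetic low-spatial-resolution HSI and injects spatial detail from a co-registered MSI. The naive detail-injection scheme and all array sizes are assumptions for illustration, not the method of the citing paper.

```python
# Naive HSI sharpening sketch (assumed shapes and fusion rule, illustrative only).
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(0)
hsi = rng.random((16, 16, 100))   # low spatial, high spectral resolution
msi = rng.random((64, 64, 4))     # high spatial, low spectral resolution

scale = msi.shape[0] // hsi.shape[0]
hsi_up = zoom(hsi, (scale, scale, 1), order=1)   # bilinear spatial upsampling

# Use the mean MSI band as a proxy for spatial detail and add its
# high-frequency residual to every upsampled HSI band.
detail = msi.mean(axis=2)
detail_lowpass = zoom(zoom(detail, 1 / scale, order=1), scale, order=1)
hsi_sharp = hsi_up + (detail - detail_lowpass)[:, :, None]
print(hsi_sharp.shape)  # (64, 64, 100)
```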
“…Since correlations exist in practical datasets such as images and user ratings, the resulting tensor data are often low-rank. This low-rank property has been exploited in problems such as low-rank tensor completion [25,26,27,28,29,30] and low-rank tensor recovery [31,32,33,34,35]. Leveraging the low-rank property, a convex relaxation of the Tucker rank can be applied to robust tensor recovery [31,32] and tensor completion [25,26].…”
Section: Introduction
confidence: 99%
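For context on the "convex relaxation of the Tucker rank" mentioned in the statement above, the following sketch computes the sum-of-nuclear-norms surrogate, i.e. a weighted sum of the nuclear norms of the mode-k unfoldings. This is the standard convex proxy used in the cited completion/recovery works, not the tensor-train formulation of the paper under review; the helper names and test tensors are illustrative assumptions.

```python
# Sum-of-nuclear-norms surrogate for the Tucker rank (illustrative sketch).
import numpy as np

def unfold(tensor, mode):
    """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def sum_of_nuclear_norms(tensor, weights=None):
    """SNN surrogate: sum_k w_k * ||X_(k)||_* over all mode-k unfoldings."""
    n_modes = tensor.ndim
    weights = weights if weights is not None else np.ones(n_modes) / n_modes
    return sum(w * np.linalg.norm(unfold(tensor, k), ord="nuc")
               for k, w in enumerate(weights))

# A low-Tucker-rank tensor yields a much smaller SNN value than pure noise.
rng = np.random.default_rng(0)
core = rng.normal(size=(2, 2, 2))
factors = [rng.normal(size=(30, 2)) for _ in range(3)]
low_rank = np.einsum("abc,ia,jb,kc->ijk", core, *factors)
noise = rng.normal(size=(30, 30, 30))
print(sum_of_nuclear_norms(low_rank / np.linalg.norm(low_rank)))
print(sum_of_nuclear_norms(noise / np.linalg.norm(noise)))
```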