2019
DOI: 10.1080/01621459.2018.1527227

Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data

Abstract: In this article, we consider the sparse tensor singular value decomposition, which aims for dimension reduction on high-dimensional high-order data with certain sparsity structure. A method named sparse tensor alternating thresholding for singular value decomposition (STAT-SVD) is proposed. The proposed procedure features a novel double projection & thresholding scheme, which provides a sharp criterion for thresholding in each iteration. Compared with the regular tensor SVD model, STAT-SVD permits more robust estima…
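To make the alternating projection-and-thresholding idea concrete, below is a minimal order-2 (matrix) sketch of that scheme. The function name, the hard-threshold rule, and the fixed rank are illustrative assumptions, not the paper's exact double projection & thresholding criterion.

```python
# Minimal sketch of alternating projection + thresholding for a sparse SVD.
# The threshold rule here is a generic hard threshold, standing in for the
# sharper criterion used by STAT-SVD.
import numpy as np

def sparse_svd_sketch(Y, r=2, threshold=0.5, n_iter=10):
    """Alternately estimate sparse left/right singular subspaces of Y."""
    # Initialize the right subspace with the top-r right singular vectors.
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    V = Vt[:r].T                                    # p x r

    for _ in range(n_iter):
        # Project onto the current right subspace, then zero out rows whose
        # norm is too small to be signal (sparsity on the left mode).
        U_tilde = Y @ V                             # n x r
        U_tilde[np.linalg.norm(U_tilde, axis=1) < threshold] = 0.0
        U, _ = np.linalg.qr(U_tilde)                # re-orthonormalize

        # Symmetric update for the right mode.
        V_tilde = Y.T @ U                           # p x r
        V_tilde[np.linalg.norm(V_tilde, axis=1) < threshold] = 0.0
        V, _ = np.linalg.qr(V_tilde)
    return U, V
```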

Cited by 55 publications (37 citation statements). References 34 publications.
“…In addition, various dimension-reduced structures are imposed on the singular vectors or subspace of the low-rank tensor decomposition model to better capture other intrinsic properties of the data, such as sparsity (Sun et al., 2017; Zhang and Han, 2019), blocking (Han et al., 2020), non-negativity (Xu and Yin, 2013) and many others. The most relevant to our work is the spatial/temporal structure, which is incorporated to characterize time/location-varying patterns for one or more of the tensor modes.…”
Section: Related Work
confidence: 99%
“…The resulting number of estimated eigenfunctions is often of the same order as the number of variables, making the eigenfunctions hard to interpret when the variable mode is high-dimensional. Another important class of works utilizes the high-order structure of the tensor data via different types of low-rank tensor decomposition models, such as (sparse) CP low-rankness (Anandkumar et al., 2014b; Sun et al., 2017), (sparse) Tucker low-rankness (Zhang and Xia, 2018; Zhang and Han, 2019), tensor-train (Oseledets, 2011; Zhou et al., 2020b), or decomposable tensor covariance (Dawid, 1981; Tsiligkaridis and Hero, 2013; Zhou, 2014).…”
Section: Introduction
confidence: 99%
“…where SVD_r represents the first r left singular vectors of the matrix. To address the high computational complexity issue, we use the latest development of the HOSVD method, the Sparse Tensor Alternating Thresholding SVD (STAT-SVD), to truncate after each projection before the SVD and QR steps [36].…”
Section: Spatiotemporal Profile Tensor Inference
confidence: 99%
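The statement above uses SVD_r (the first r left singular vectors) on a mode unfolding, with truncation applied after projection and before the SVD/QR step. A hedged sketch of that operator is given below; the unfolding convention, helper names, and the simple row-norm threshold are illustrative assumptions rather than the cited paper's exact procedure.

```python
# Sketch of SVD_r on a mode-k unfolding of a tensor, with a generic hard
# truncation of small rows before the SVD/QR step.
import numpy as np

def unfold(T, mode):
    """Mode-k matricization: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def svd_r(M, r):
    """SVD_r: the first r left singular vectors of M."""
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :r]

def truncated_mode_basis(T, mode, r, threshold):
    """Project (unfold), truncate small rows, then orthonormalize via SVD and QR."""
    M = unfold(T, mode)
    keep = np.linalg.norm(M, axis=1) >= threshold   # drop near-zero rows
    M_trunc = np.where(keep[:, None], M, 0.0)
    Q, _ = np.linalg.qr(svd_r(M_trunc, r))          # orthonormal mode-k factor
    return Q
```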
“…Schmidt [5] further proposed an approximation theorem, proving that SVD can be used to obtain the optimal low-rank approximation of an operator. Beyond these classic works, SVD theory has continued to expand in recent decades [6,7,8,9,10], and application-oriented theoretical works have also been actively studied, especially in recent years [11,12,13,14].…”
Section: Introduction
confidence: 99%
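The Schmidt (Eckart–Young) result cited in that statement can be checked numerically: truncating the SVD at rank r gives the best rank-r approximation in Frobenius norm. The snippet below is a small self-contained illustration; the random comparison matrix is only an arbitrary baseline.

```python
# Numerical check of the Schmidt / Eckart-Young optimality of truncated SVD.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
r = 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]            # rank-r truncated SVD

# The error equals the root of the sum of the discarded squared singular values.
svd_err = np.linalg.norm(A - A_r, "fro")
print(svd_err, np.sqrt(np.sum(s[r:] ** 2)))

# Any other rank-r matrix (here, a random one) can only do worse.
B = rng.standard_normal((8, r)) @ rng.standard_normal((r, 6))
print(np.linalg.norm(A - B, "fro") >= svd_err)      # True
```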