2018
DOI: 10.1109/tsp.2018.2853137
Subspace-Orbit Randomized Decomposition for Low-Rank Matrix Approximations

Abstract: An efficient, accurate and reliable approximation of a matrix by one of lower rank is a fundamental task in numerical linear algebra and signal processing applications. In this paper, we introduce a new matrix decomposition approach termed Subspace-Orbit Randomized singular value decomposition (SOR-SVD), which makes use of random sampling techniques to give an approximation to a low-rank matrix. Given a large and dense data matrix of size m × n with numerical rank k, where k ≪ min{m, n}, the algorithm requires a…

Cited by 36 publications (29 citation statements)
References 48 publications (87 reference statements)
“…It captures most attributes of the data by means of forming the smaller matrix through linear combinations of both rows and columns of the given matrix, and then applies the SVD for further computations. The work in [43] proposed a randomized algorithm termed subspace-orbit randomized SVD (SOR-SVD). SOR-SVD alternately projects the matrix onto its column and row space.…”
Section: Mathematical Model and Related Work
confidence: 99%
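The alternating projection onto the column and row spaces described in the quoted statement can be sketched as follows. This is a minimal illustration based only on that description, not the authors' exact SOR-SVD; the oversampling parameter `p`, the fixed seeds, and all variable names are assumptions.

```python
import numpy as np

def sor_svd(A, k, p=5):
    """Sketch of a subspace-orbit randomized SVD: sample the column
    space, project onto the row space, then apply a small SVD to the
    compressed matrix. Hedged illustration, not the published algorithm."""
    m, n = A.shape
    ell = k + p                        # sample size with small oversampling p
    rng = np.random.default_rng(0)
    Omega = rng.standard_normal((n, ell))
    Y1 = A @ Omega                     # sample the column space of A
    Q1, _ = np.linalg.qr(Y1)           # orthonormal basis for the column space
    Y2 = A.T @ Q1                      # project onto the row space
    Q2, _ = np.linalg.qr(Y2)           # orthonormal basis for the row space
    B = Q1.T @ A @ Q2                  # small ell x ell compressed matrix
    Ub, s, Vbt = np.linalg.svd(B)      # truncated SVD of the compression
    U = Q1 @ Ub[:, :k]                 # lift the factors back to full size
    V = Q2 @ Vbt.T[:, :k]
    return U, s[:k], V

# usage: approximate a matrix of exact rank 3
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 80))
U, s, V = sor_svd(A, k=3)
err = np.linalg.norm(A - U @ np.diag(s) @ V.T) / np.linalg.norm(A)
```

For a matrix of exact rank k, the sampled bases capture the column and row spaces almost surely, so the reconstruction error is near machine precision.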
“…This, as explained later on, reduces the computational costs of CoR-UTV compared to TSR-SVD. The key difference between CoR-UTV and SOR-SVD [68], however, lies in the computation of the compressed matrix D; SOR-SVD applies a truncated SVD and, as a result, gives a rank-k approximation to A, while CoR-UTV employs a column-pivoted QR decomposition and returns a rank-ℓ approximation. Nevertheless, the SVD is computationally more expensive than the column-pivoted QR, and standard techniques for computing it are challenging to parallelize [45], [46], [47].…”
Section: Output
confidence: 99%
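The cheaper alternative the quote contrasts with the truncated SVD, a rank-ℓ approximation of the compressed matrix via column-pivoted QR, can be sketched as below. This is a hedged illustration of that single step, not the CoR-UTV algorithm itself; the function name and test matrix are assumptions.

```python
import numpy as np
from scipy.linalg import qr

def qr_low_rank(D, ell):
    """Rank-ell approximation of a compressed matrix D via column-pivoted
    QR (illustrative; CoR-UTV's actual factorization may differ)."""
    Q, R, piv = qr(D, pivoting=True)       # D[:, piv] = Q @ R
    Pinv = np.argsort(piv)                 # inverse of the column permutation
    # keep the leading ell columns of Q / rows of R, then un-permute columns
    return Q[:, :ell] @ R[:ell, :][:, Pinv]

# usage: a matrix of exact rank 4 is reconstructed exactly at ell = 4
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 4)) @ rng.standard_normal((4, 40))
D_hat = qr_low_rank(D, ell=4)
rel_err = np.linalg.norm(D - D_hat) / np.linalg.norm(D)
```

Unlike the SVD, the pivoted QR is a direct factorization with well-understood blocked implementations, which is the parallelizability point the quoted passage makes.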
“…We borrow material from [68] since the two algorithms, CoR-UTV and SOR-SVD, have a few similar steps. However,…”
Section: Analysis of CoR-UTV Decompositions
confidence: 99%
“…In order to find a unique mapping between the signal x and the measurement y, the constraint of sparsity on x can be utilized [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36], Corresponding author: Sheng Li (email: shengli@zjut.edu.cn). [37], [38], [39], [40], [41], [42], [43], [44], [45]. The sparse representation for x can be expressed as:…”
Section: Introduction
confidence: 99%
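The quoted equation is truncated, but the sparsity constraint it invokes (recovering x uniquely from y = Φx when x has few nonzeros) can be illustrated with orthogonal matching pursuit, one standard greedy solver. This is an assumption-laden sketch: OMP is not necessarily the method that paper uses, and `Phi`, the sparsity level, and the seeds are all hypothetical.

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x.
    Illustrative sketch of exploiting sparsity; not from the cited paper."""
    r, support = y.copy(), []
    for _ in range(k):
        # pick the column of Phi most correlated with the residual
        support.append(int(np.argmax(np.abs(Phi.T @ r))))
        # least-squares fit of y on the selected columns
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef            # update the residual
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

# usage: 3-sparse signal in R^100 recovered from 40 random measurements
rng = np.random.default_rng(0)
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)   # measurement matrix
x_true = np.zeros(100)
x_true[[5, 30, 77]] = [1.0, -2.0, 0.5]
y = Phi @ x_true
x_hat = omp(Phi, y, k=3)
```

With 40 Gaussian measurements of a 3-sparse signal, the underdetermined system y = Φx has a unique sparse solution with overwhelming probability, which is the uniqueness point the quoted sentence makes.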