2022
DOI: 10.1007/s10915-022-01917-5
Randomized Quaternion QLP Decomposition for Low-Rank Approximation

Cited by 6 publications (2 citation statements) | References 42 publications
“…The complexity of quaternion products motivates the study of fast algorithms for quaternion matrix decomposition. In recent years, by virtue of real structure-preserving algorithms, many excellent results have been obtained on quaternion LU decomposition, quaternion singular value decomposition, and eigenvalue decomposition of quaternion Hermitian matrices [9][10][11][12][13][14][15][16][17][18]. Similarly, commutative quaternions give rise to a multitude of operations depending on their composition.…”
Section: Introduction (mentioning)
confidence: 99%
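The excerpt above refers to real structure-preserving algorithms for quaternion matrix decompositions. As context, here is a minimal NumPy sketch of the real counterpart matrix that such algorithms operate on; the names real_rep and quat_matmul are illustrative and do not come from the cited works.

import numpy as np

def real_rep(A0, A1, A2, A3):
    # Real counterpart of the quaternion matrix Q = A0 + A1*i + A2*j + A3*k.
    # Products of quaternion matrices map to products of their real counterparts,
    # which is the property structure-preserving algorithms exploit.
    return np.block([
        [A0, -A1, -A2, -A3],
        [A1,  A0, -A3,  A2],
        [A2,  A3,  A0, -A1],
        [A3, -A2,  A1,  A0],
    ])

def quat_matmul(P, Q):
    # Multiply two quaternion matrices given as 4-tuples of real component arrays.
    P0, P1, P2, P3 = P
    Q0, Q1, Q2, Q3 = Q
    R0 = P0 @ Q0 - P1 @ Q1 - P2 @ Q2 - P3 @ Q3
    R1 = P0 @ Q1 + P1 @ Q0 + P2 @ Q3 - P3 @ Q2
    R2 = P0 @ Q2 - P1 @ Q3 + P2 @ Q0 + P3 @ Q1
    R3 = P0 @ Q3 + P1 @ Q2 - P2 @ Q1 + P3 @ Q0
    return R0, R1, R2, R3

# Sanity check of the homomorphism: real_rep(P @ Q) == real_rep(P) @ real_rep(Q).
rng = np.random.default_rng(0)
P = tuple(rng.standard_normal((3, 4)) for _ in range(4))
Q = tuple(rng.standard_normal((4, 5)) for _ in range(4))
lhs = real_rep(*quat_matmul(P, Q))
rhs = real_rep(*P) @ real_rep(*Q)
print(np.allclose(lhs, rhs))  # True

Because the real counterpart preserves products, factorizations such as LU, QR, and SVD of a quaternion matrix can be carried out entirely in real arithmetic.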
“…In addition to rank minimization, recent studies have utilized quaternion matrix decomposition and randomized techniques to improve the performance of LRQA [17,27,28]. For instance, Miao et al. [29] suggested factorizing the target quaternion matrix and then applying three quaternion-based bilinear factor matrix norm factorization methods for low-rank quaternion matrix completion, which avoids expensive calculations.…”
Section: Introduction (mentioning)
confidence: 99%
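Randomized QLP methods of the kind named in the paper's title typically combine a randomized range finder with Stewart's QLP step (a column-pivoted QR followed by a second QR of the transposed triangular factor). Below is a minimal real-valued sketch of that general idea, assuming NumPy and SciPy; it is not the authors' quaternion algorithm, and the name randomized_qlp and its parameters are placeholders.

import numpy as np
from scipy.linalg import qr

def randomized_qlp(A, k, p=5, seed=None):
    # Rank-(k+p) approximation A ~= Q @ L @ P.T from a randomized range finder
    # followed by Stewart's QLP step on the compressed matrix.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    s = k + p                                   # sketch size with oversampling p
    # Stage 1: randomized range finder; Q1 spans an approximate range of A.
    Omega = rng.standard_normal((n, s))
    Q1, _ = np.linalg.qr(A @ Omega)
    # Stage 2: QLP of the small matrix B = Q1.T @ A:
    # column-pivoted QR, then QR of R.T, so that B = Qb @ L @ (Pi @ Q2).T.
    B = Q1.T @ A
    Qb, R, piv = qr(B, mode='economic', pivoting=True)
    Pi = np.eye(n)[:, piv]                      # column permutation as a matrix
    Q2, Lt = np.linalg.qr(R.T)                  # R = Lt.T @ Q2.T, with Lt.T lower triangular
    return Q1 @ Qb, Lt.T, Pi @ Q2               # Q (m x s), L (s x s), P (n x s)

# Usage: recover a rank-10 matrix; the diagonal of L tracks the leading singular values.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))
Q, L, P = randomized_qlp(A, k=10)
print(np.linalg.norm(A - Q @ L @ P.T) / np.linalg.norm(A))

In the quaternion setting, each matrix product and QR factorization above would be replaced by its structure-preserving real-counterpart computation.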