2012
DOI: 10.1007/s12532-012-0044-1

Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm

Abstract: The matrix completion problem is to recover a low-rank matrix from a subset of its entries. The main solution strategy for this problem has been based on nuclear-norm minimization, which requires computing singular value decompositions, a task that is increasingly costly as matrix sizes and ranks increase. To improve the capacity of solving large-scale problems, we propose a low-rank factorization model and construct a nonlinear successive over-relaxation (SOR) algorithm that only requires solving a l…
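The abstract is truncated by the source page. As a rough illustration of the kind of method it describes, the sketch below fits a low-rank factorization by alternating least squares with a fixed over-relaxation weight. The objective form, the constant weight `omega`, and all function and variable names are assumptions made here for illustration; the paper's LMaFit algorithm differs in detail (for instance, it adjusts the relaxation weight adaptively).

```python
import numpy as np

def lowrank_complete(M, mask, k, omega=1.0, iters=200, seed=0):
    """Sketch of matrix completion via a rank-k factorization X @ Y.

    Assumed model: minimize 0.5 * ||X @ Y - Z||_F^2 over X, Y, Z,
    subject to Z agreeing with M on the observed entries (mask == True).
    omega >= 1 acts as a simple SOR-style relaxation weight; here it is
    fixed, whereas the paper tunes it dynamically.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.standard_normal((m, k))
    Y = rng.standard_normal((k, n))
    Z = np.where(mask, M, 0.0)                # observed data, zeros elsewhere

    for _ in range(iters):
        # Over-relax: extrapolate Z past the current fit X @ Y.
        Zw = omega * Z + (1.0 - omega) * (X @ Y)
        # Alternating least-squares updates of the two factors.
        X = Zw @ np.linalg.pinv(Y)
        Y = np.linalg.pinv(X) @ Zw
        # Keep the observed entries of M, fill the rest with the fit.
        Z = np.where(mask, M, X @ Y)
    return X, Y, Z
```

With omega = 1.0 the loop reduces to plain alternating least squares; values slightly above 1 over-relax the update.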

Cited by 697 publications (602 citation statements)
References 27 publications
“…In this sense, if the algorithm has converged, we can a posteriori use the boundedness of the sequence to show that the limit point is a local stationary point. This is a slightly different guarantee than that proved in [20].…”
Section: Proximal Gradient Methods
Mentioning confidence: 73%
“…We compared our algorithm in [9] with LMaFit [11] and FPCA [12] on recovering three-dimensional hyperspectral images from their incomplete observations. Our test hyperspectral datacube has 163 slices, and the size of each slice is 80 × 80.…”
Section: Numerical Results
Mentioning confidence: 99%
“…A further saving is achieved in the "LMaFit" algorithm proposed by Wen et al. [277]. Here the basic Proximal-ALS iteration is carried out with only one QR factorization, which is used to generate an orthonormal basis for the column space of the matrix A_ℓV_ℓ.…”
Section: The LMaFit Algorithm
Mentioning confidence: 99%
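The excerpt above points to the main computational saving in LMaFit: one thin QR factorization per iteration supplies an orthonormal basis for the column space of A_ℓV_ℓ, in place of two least-squares solves. The snippet below is a minimal sketch of that idea with assumed names and shapes (Z for the current completed matrix, V for the current row-space factor); it is not taken from [277] or from the authors' code.

```python
import numpy as np

def qr_update(Z, V):
    """One QR-based factor update in the spirit of the excerpt above.

    Assumed shapes: Z is m x n, V is k x n. A thin QR of Z @ V.T gives
    Q, an orthonormal basis for the column space of Z @ V.T; the second
    factor then follows by a projection instead of a least-squares solve.
    """
    Q, _ = np.linalg.qr(Z @ V.T, mode="reduced")  # Q: m x k, orthonormal columns
    W = Q.T @ Z                                   # W: k x n
    return Q, W                                   # Z is approximated by Q @ W
```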
“…Otherwise, when the columns of V_ℓ are linearly dependent, the proof is concluded by using the SVD of V_ℓ, see [277].…”
Section: The LMaFit Algorithm
Mentioning confidence: 99%