Tensor completion using total variation and low-rank matrix factorization (2016)
DOI: 10.1016/j.ins.2015.07.049

Cited by 161 publications (76 citation statements). References 34 publications.
“…It appears to be natural to directly extend low-rank matrix factorization methods to the low-rank tensor factorization problem. Based on this definition, Ji et al. [48] further proposed a nonconvex approach. However, these methods involved the singular value decomposition (SVD) of the mode-n unfolding Y_(n), which is time-consuming.…”
Section: Tensor Tucker Factorization Model
confidence: 99%
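
The trade-off in this excerpt can be made concrete: a full SVD of a mode-n unfolding scales poorly with tensor size, while a fixed-rank factorization of the same unfolding can be updated by alternating least squares with no SVD at all. Below is a minimal NumPy sketch; the `unfold` helper, the rank r, and the update loop are illustrative assumptions, not the algorithm of [48].

```python
import numpy as np

def unfold(tensor, n):
    """Mode-n unfolding Y_(n): mode n becomes the rows, all other modes the columns."""
    return np.moveaxis(tensor, n, 0).reshape(tensor.shape[n], -1)

rng = np.random.default_rng(0)
Y = rng.standard_normal((30, 40, 50))

# SVD route: a full decomposition of the unfolding -- the step the quoted
# passage calls time-consuming when repeated for every mode and iteration.
U, s, Vt = np.linalg.svd(unfold(Y, 0), full_matrices=False)

# Factorization route: maintain an explicit rank-r factorization M ~ A @ B
# and refine it by alternating least squares, avoiding SVDs entirely.
r = 5
M = unfold(Y, 0)
A = rng.standard_normal((M.shape[0], r))
B = rng.standard_normal((r, M.shape[1]))
for _ in range(10):
    A = M @ np.linalg.pinv(B)   # least-squares update of A with B fixed
    B = np.linalg.pinv(A) @ M   # least-squares update of B with A fixed
print(np.linalg.norm(M - A @ B) / np.linalg.norm(M))  # relative fit error
```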
“…The symmetric alternating direction method of multipliers (symmetric ADMM) is an accelerated variant of ADMM that can be used to solve constrained optimization formulations in image processing [34-43].…”
Section: Symmetric ADMM
confidence: 99%
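
Symmetric ADMM differs from standard ADMM by inserting an extra multiplier update between the two primal subproblems. As a minimal sketch, here it is applied to a toy l1-denoising split in which both subproblems have closed forms; the function names and the step sizes beta and tau are illustrative assumptions, not the formulation of the citing paper.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def symmetric_admm_l1(a, lam, beta=1.0, tau=0.9, iters=200):
    """Symmetric ADMM for min_x 0.5*||x - a||^2 + lam*||x||_1,
    split as f(x) = 0.5*||x - a||^2, g(z) = lam*||z||_1, s.t. x = z.
    The scaled multiplier u is updated twice per sweep (the 'symmetric' step)."""
    x = np.zeros_like(a); z = np.zeros_like(a); u = np.zeros_like(a)
    for _ in range(iters):
        x = (a + beta * (z - u)) / (1.0 + beta)  # x-subproblem (closed form)
        u = u + tau * (x - z)                    # intermediate multiplier update
        z = soft(x + u, lam / beta)              # z-subproblem (soft-thresholding)
        u = u + tau * (x - z)                    # second multiplier update
    return x

a = np.array([3.0, -0.5, 1.2, -2.0])
print(symmetric_admm_l1(a, lam=1.0))  # converges to soft(a, 1) = [2, 0, 0.2, -1]
```

Taking tau < 1 gives the strictly contractive variant, which is the usual way to guarantee convergence for convex problems; tau = 1 recovers the Peaceman-Rachford splitting.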
“…Many optimization methods can be applied to solve the proposed formulation (P2), such as the split Bregman method and the alternating direction method of multipliers [35-40,42,43,49,50]. Here, we employ symmetric ADMM to solve (P2) due to its simplicity and efficiency [34].…”
Section: Optimization Algorithm
confidence: 99%
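
For reference, the standard symmetric ADMM template for a two-block problem min_{x,z} f(x) + g(z) subject to Ax + Bz = c, written with a scaled multiplier u, penalty beta, and relaxation tau, is given below; the excerpt does not reproduce (P2) itself, so these are the generic updates rather than the paper's exact steps.

```latex
\begin{aligned}
x^{k+1}   &= \arg\min_x \; f(x) + \tfrac{\beta}{2}\bigl\|Ax + Bz^{k} - c + u^{k}\bigr\|_2^2,\\
u^{k+1/2} &= u^{k} + \tau\bigl(Ax^{k+1} + Bz^{k} - c\bigr),\\
z^{k+1}   &= \arg\min_z \; g(z) + \tfrac{\beta}{2}\bigl\|Ax^{k+1} + Bz - c + u^{k+1/2}\bigr\|_2^2,\\
u^{k+1}   &= u^{k+1/2} + \tau\bigl(Ax^{k+1} + Bz^{k+1} - c\bigr).
\end{aligned}
```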
“…Nowadays, TV regularization is widely extended to other fields, such as natural image restoration [42,43] and tensor completion [44]. Compared with Tikhonov-like regularization, TV regularization better preserves sharp edges and promotes piecewise-smooth objects.…”
Section: TV Regularization
confidence: 99%
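
The edge-preservation claim has a simple numerical illustration: on a step edge and a smooth ramp with the same total rise, anisotropic TV assigns both the same penalty, while a Tikhonov-like quadratic penalty heavily favors the ramp. A minimal sketch, with hypothetical helper names and test images:

```python
import numpy as np

def tv_aniso(x):
    """Anisotropic total variation: sum of |forward differences| along both axes."""
    return np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()

def tikhonov(x):
    """Tikhonov-like penalty: sum of squared forward differences."""
    return (np.diff(x, axis=0) ** 2).sum() + (np.diff(x, axis=1) ** 2).sum()

# A sharp step edge and a smooth ramp, both rising from 0 to 1 across each row.
step = np.tile(np.concatenate([np.zeros(8), np.ones(8)]), (16, 1))
ramp = np.tile(np.linspace(0.0, 1.0, 16), (16, 1))

# TV charges both profiles the same (it depends only on the total rise),
# while the quadratic penalty strongly prefers the smooth ramp.
print(tv_aniso(step), tv_aniso(ramp))   # equal
print(tikhonov(step), tikhonov(ramp))   # step >> ramp
```

Because TV does not penalize one jump more than a gradual transition of the same height, minimizing it keeps edges sharp, whereas the quadratic penalty drives solutions toward over-smoothed ramps.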