2020
DOI: 10.1109/tip.2020.2995061

Fast and Accurate Tensor Completion With Total Variation Regularized Tensor Trains

Abstract: We propose a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, where we use the known tensor entries and their coordinates to update the tensor train. A novel tensor train initialization procedure is proposed specifically for image and video completion, which is demonstrated to ensure fast convergence of the completion algorithm. The tensor train framework is also shown to easily accommodate Total Variation and Tikhonov regularization due to …
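To make the setup concrete: in a tensor train, every entry of the full tensor is recovered as a product of matrix slices drawn from a short list of small cores, so storing and updating the cores stays cheap even for large tensors. Below is a minimal NumPy sketch of this evaluation under assumed toy shapes and ranks; it illustrates the representation the paper builds on, not the authors' implementation. Completion then amounts to fitting the cores so that these evaluated entries match the known ones at their coordinates.

```python
import numpy as np

def tt_entry(cores, idx):
    """Evaluate one entry of a tensor stored in tensor-train form.

    cores[k] has shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1,
    so each entry is a short chain of small matrix products.
    """
    v = cores[0][:, idx[0], :]              # shape (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]      # multiply in the next slice
    return v[0, 0]

# Hypothetical toy example: an order-3 tensor with TT ranks (1, 2, 2, 1).
rng = np.random.default_rng(0)
shape, ranks = (4, 5, 6), (1, 2, 2, 1)
cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(len(shape))]
print(tt_entry(cores, (1, 2, 3)))
```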

Cited by 48 publications (18 citation statements: 0 supporting, 18 mentioning, 0 contrasting)
References 32 publications

“…(2) TMac [36]; (3) TMac-TT method [13]; (4) SiLRTC-TT method [13]; (5) PCLR method [37]; (6) TTC and TTC-TV method [28].…”
Section: Experiments and Results, 4.1 Experimental Parameter Selection (mentioning)
Confidence: 99%

“…Another completion model, which combines the TT rank with TV, is proposed in ref. [28] by assuming a tensor-train structure in the underlying regression model. The completion problem is rephrased as a regression task and the tensor-train cores are updated with the alternating linear scheme, but the results also show that the alternating direction method of multipliers (ADMM)-TV method performs better in relative standard error (RSE) and peak signal-to-noise ratio (PSNR) scores.…”
Section: Introduction (mentioning)
Confidence: 99%
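Since each entry of a tensor train is linear in any one core when the others are held fixed, the alternating linear scheme cited above turns every core update into an ordinary least-squares fit over the observed entries. The NumPy sketch below shows one such core update; it is a bare-bones reading of that scheme, with a small `reg` term standing in for Tikhonov regularization and the TV penalty omitted, so treat it as an assumed simplification rather than the authors' code.

```python
import numpy as np

def als_update_core(cores, k, idx_list, vals, reg=1e-6):
    """Refit core k by linear least squares, all other cores held fixed.

    Each observed entry equals left @ S @ right, where S = cores[k][:, i, :],
    so every mode-k slice solves an ordinary least-squares problem; `reg` is
    a small Tikhonov term that keeps the normal equations well conditioned.
    """
    r_left, n_k, r_right = cores[k].shape
    rows = {i: [] for i in range(n_k)}      # design rows per mode-k index
    rhs = {i: [] for i in range(n_k)}
    for idx, y in zip(idx_list, vals):
        left = np.ones((1, 1))
        for j in range(k):                  # product of cores left of k
            left = left @ cores[j][:, idx[j], :]
        right = np.ones((1, 1))
        for j in range(len(cores) - 1, k, -1):  # product of cores right of k
            right = cores[j][:, idx[j], :] @ right
        rows[idx[k]].append(np.outer(left.ravel(), right.ravel()).ravel())
        rhs[idx[k]].append(y)
    for i in range(n_k):
        if rows[i]:                         # skip slices with no observations
            A, b = np.array(rows[i]), np.array(rhs[i])
            sol = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ b)
            cores[k][:, i, :] = sol.reshape(r_left, r_right)
    return cores
```

Sweeping this update across all cores, forward and backward, gives one ALS pass; in the full TTC-TV method the TV penalty would enter each of these least-squares problems as an additional quadratic term.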
“…Though still lacking sufficient theoretical foundations and algorithmic variety compared to its matrix-based counterpart, tensor completion already has a quite rich literature [1], which includes (among other approaches) methods based on low-rank decomposition models. In addition to the classical Canonical Polyadic Decomposition (CPD) and/or Tucker decomposition [7,8,9,10], less well-known models, such as the t-SVD [11], tensor trains [12] (and tensor rings [2]), and the M-decomposition [13], have been studied in the context of the completion problem. The model order (e.g., the tensor rank in CPD) is almost always assumed to be known a priori.…”
Section: Introduction (mentioning)
Confidence: 99%

“…Recently, research in tensor completion (TC), which is a higher-order extension of matrix completion, has achieved consistent performance in a variety of real-world applications [15][16][17][18]. Given a tensor with incomplete entries, tensor completion methods such as the recent low-rank tensor completion (LRTC) [19] and simultaneous tensor decomposition and completion (STDC) [20] use low-rank factorization to model the available data entries; the fitted low-rank model is then used to recover the missing values [21][22][23]. TC methods have already been used in electroencephalography (EEG) to recover missing samples, showing better performance than other simple imputation methods [24,25].…”
Section: Introduction (mentioning)
Confidence: 99%
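The low-rank factorization idea invoked here is easy to state independently of any particular TC algorithm: fit a factorized model to the observed entries only, then read the missing values off the reconstruction. The sketch below does this for the matrix case with plain gradient descent; it is a generic illustration of the principle, not the LRTC [19] or STDC [20] algorithms, which operate on tensors and use different optimization machinery.

```python
import numpy as np

def lowrank_complete(M, mask, rank=3, lr=0.01, iters=3000, seed=0):
    """Fill missing entries of M by fitting M ≈ U @ V.T on observed entries.

    `mask` is 1 where M is observed and 0 elsewhere; plain gradient descent
    on the masked squared error, then the reconstruction fills the gaps.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = mask * (U @ V.T - M)            # residual on observed entries only
        U -= lr * (R @ V)
        V -= lr * (R.T @ U)
    return U @ V.T

# Usage sketch: observe half the entries of a rank-3 matrix, recover the rest.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
mask = (rng.random(M.shape) < 0.5).astype(float)
M_hat = lowrank_complete(M * mask, mask, rank=3)
print(np.linalg.norm((1 - mask) * (M_hat - M)) / np.linalg.norm((1 - mask) * M))
```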