2019
DOI: 10.1609/aaai.v33i01.33019151
Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

Abstract: In tensor completion tasks, the traditional low-rank tensor decomposition models suffer from the laborious model selection problem due to their high model sensitivity. In particular, for tensor ring (TR) decomposition, the number of model possibilities grows exponentially with the tensor order, which makes it rather challenging to find the optimal TR decomposition. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method which is robust to model se…

Cited by 137 publications (77 citation statements)
References 16 publications
“…The TRNNM algorithm is summarized in Algorithm 1. […] the computational complexity of our method may be much smaller when I ≤ R. In practice, as shown in [3,5], the suitable TR-rank R is often much higher than the dimension I in the high-order form of visual data, which we also observed in our experiments. In addition, owing to the immense number of possible TR-rank choices, the computational complexity of TR-ALS and TRLRF increases exponentially when cross-validation is used.…”
Section: Tensor Ring Nuclear Norm Minimization (supporting)
confidence: 72%
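For readers unfamiliar with the TR format this snippet refers to, here is a minimal NumPy sketch (not the cited authors' code) of how a full tensor is reconstructed from TR cores of shape (R_k, I_k, R_{k+1}); the trace at the end closes the ring, which is what distinguishes TR from the tensor-train format:

```python
import numpy as np

def tr_to_full(cores):
    # Each TR core G_k has shape (R_k, I_k, R_{k+1}); the ring closes,
    # so the last rank index equals the first (R_{d+1} = R_1).
    full = cores[0]
    for core in cores[1:]:
        # contract the trailing rank index with the next core's leading one
        full = np.tensordot(full, core, axes=([-1], [0]))
    # close the ring: trace over the first and last rank indices
    return np.trace(full, axis1=0, axis2=-1)

rng = np.random.default_rng(0)
cores = [rng.standard_normal((2, 4, 2)) for _ in range(3)]  # order 3, TR-rank 2
X = tr_to_full(cores)
print(X.shape)  # (4, 4, 4)
```

Each entry X[i1, i2, i3] equals Trace(G1[:, i1, :] @ G2[:, i2, :] @ G3[:, i3, :]), which makes the storage cost linear in the order d for fixed rank R.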
“…The low-rank tensor completion (LRTC) problem aims to recover an incomplete tensor from its observed entries by assuming different low-rank tensor structures, and it has attracted a lot of attention in the past decades [1][2][3][4][5]. Most recently, Zhao et al. proposed tensor ring decomposition [6], which achieves state-of-the-art performance on the LRTC problem [3][4][5].…”
Section: Introduction (mentioning)
confidence: 99%
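As a concrete instance of the low-rank completion idea (a generic singular-value-thresholding sketch, not any of the cited algorithms), missing entries of a low-rank matrix, e.g. a tensor unfolding, can be filled by alternating nuclear-norm shrinkage with re-imposing the observed entries:

```python
import numpy as np

def svt(M, tau):
    # singular value thresholding: the proximal operator of the nuclear norm
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(M_obs, mask, tau=0.5, iters=300):
    # start with zeros on the missing entries, then alternate shrinkage
    # with projection back onto the observed data
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        X = svt(X, tau)
        X[mask] = M_obs[mask]
    return X

rng = np.random.default_rng(1)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))  # rank-2 ground truth
mask = rng.random(M.shape) < 0.6                                 # observe ~60% of entries
M_hat = complete(M, mask)
```

TR-based methods generalize this pattern by imposing low rank on the TR cores or on several tensor unfoldings at once rather than on a single matricization.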
“…This method reduces computational cost but still requires a pre-defined TR rank. By exploiting the low-rank structure of the TR latent space, a nuclear norm regularization model is proposed to alleviate the burden of TR-rank selection [20], which greatly reduces the computational cost. In [21], a TR nuclear norm minimization model with a tensor circular unfolding scheme is proposed for tensor completion.…”
Section: Introduction (mentioning)
confidence: 99%
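The tensor circular unfolding mentioned for [21] can be sketched as a circular shift of the modes followed by a reshape; the exact index convention in that paper may differ, so treat this as illustrative:

```python
import numpy as np

def circular_unfold(T, k, d):
    # group modes k, k+1, ..., k+d-1 (indices mod T.ndim) as the rows
    # and the circularly following modes as the columns
    n = T.ndim
    perm = [(k + j) % n for j in range(n)]   # circular shift starting at mode k
    Tp = np.transpose(T, perm)
    rows = int(np.prod(Tp.shape[:d]))
    return Tp.reshape(rows, -1)

T = np.arange(2 * 3 * 4 * 5).reshape(2, 3, 4, 5)
M = circular_unfold(T, k=1, d=2)
print(M.shape)  # (12, 10): modes of size (3, 4) as rows, (5, 2) as columns
```

The appeal of this unfolding for TR models is that the rank of M is bounded by a product of only two TR-ranks, so minimizing its nuclear norm indirectly controls the TR-rank.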
“…There are also tensor completion algorithms based on other tensor decomposition models, e.g., tensor ring (TR) decomposition [23,24] and hierarchical Tucker (HT) decomposition. Based on TR decomposition, the works in [25,26,27] propose the algorithms TR-ALS, TR-WOPT, and TRLRF, which apply ALS, gradient descent, and nuclear norm minimization, respectively, to solve various tensor completion problems. Moreover, using total variation (TV) regularization and HT decomposition, [28] proposes a completion algorithm named STTC, which simultaneously explores the global low-rank tensor structure and the local correlation structure of the data.…”
Section: Introduction (mentioning)
confidence: 99%