2021
DOI: 10.1137/20m1323448
Tensor Methods for Nonlinear Matrix Completion

Abstract: In the low-rank matrix completion (LRMC) problem, the low-rank assumption means that the columns (or rows) of the matrix to be completed are points on a low-dimensional linear algebraic variety. This paper extends this thinking to cases where the columns are points on a low-dimensional nonlinear algebraic variety, a problem we call low algebraic dimension matrix completion (LADMC). Matrices whose columns belong to a union of subspaces are an important special case. We propose an LADMC algorithm that leverages …
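The lifting idea behind LADMC can be made concrete with a small sketch (our own illustration, not code from the paper; all names and dimensions are assumptions): columns drawn from a union of two 2-D subspaces of R^4 form a full-rank matrix, so plain low-rank completion is uninformative, but mapping each column to its monomials of degree at most 2 yields a rank-deficient lifted matrix, to which low-rank completion can then be applied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Columns drawn from a union of two random 2-D subspaces of R^4.
A = rng.standard_normal((4, 2))
B = rng.standard_normal((4, 2))
X = np.column_stack([(A if i % 2 == 0 else B) @ rng.standard_normal(2)
                     for i in range(40)])  # 4 x 40

def lift_quadratic(X):
    """Stack all monomials of degree <= 2 in each column's entries."""
    d, n = X.shape
    rows = [np.ones(n)]                                      # degree 0
    rows += [X[i] for i in range(d)]                         # degree 1
    rows += [X[i] * X[j] for i in range(d) for j in range(i, d)]  # degree 2
    return np.vstack(rows)

L = lift_quadratic(X)  # 15 x 40 lifted matrix
print(np.linalg.matrix_rank(X))  # 4: full ambient rank
print(np.linalg.matrix_rank(L))  # strictly below 15: lifted matrix is low-rank
```

The rank drop in the lifted space occurs because each 2-D subspace lifts into a 6-dimensional space of monomial evaluations, so the union spans well under the 15 available lifted coordinates.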

Cited by 16 publications (10 citation statements); references 40 publications.
“We compare the proposed PMC methods with the LRMC method (nuclear norm minimization), SRMC (Fan and Chow 2017), LADMC (Ongie et al. 2018) (the iterative algorithm), VMC-2 (2nd-order polynomial kernel), VMC-3 (3rd-order polynomial kernel) (Ongie et al. 2017), and NLMC (Fan and Chow 2018) (RBF kernel) on matrix completion with synthetic data, subspace clustering on incomplete data, motion capture data recovery, and classification on incomplete data. Note that PMC with R1 is equivalent to the NLMC method of ….”
Section: Methods
confidence: 99%
“The models have been proposed for subspace clustering (Elhamifar and Vidal 2013), manifold learning (Roweis and Saul 2000; Van der Maaten, Postma, and Van den Herik 2009), and deep learning (Hinton and Salakhutdinov 2006). Related work: recently, matrix completion on multiple-subspace data and nonlinear data has drawn many researchers' attention (Eriksson, Balzano, and Nowak 2011; Yang, Robinson, and Vidal 2015; Li and Vidal 2016; Fan and Cheng 2018; Alameda-Pineda et al. 2016; Elhamifar 2016; Fan, Zhao, and Chow 2018; Fan and Chow 2017; Ongie et al. 2017; Ongie et al. 2018). For example, Elhamifar (2016) proposed a group-sparse optimization with rank-one constraints to complete and cluster multiple-subspace data.…”
Section: Introduction
confidence: 99%
“Note that the truncated nuclear norm with r = 0 is equal to the nuclear norm. In this case, problem (11) is the same as problem (16). When the variables X and D_i are constant, the optimal solution for each Z_i is obtained by…”
Section: Truncated Nuclear Norm Minimization Approach
confidence: 99%
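The truncated nuclear norm referenced in this excerpt sums all but the r largest singular values, which is why setting r = 0 recovers the ordinary nuclear norm. A minimal numerical check (the helper name is ours, not from the cited work):

```python
import numpy as np

def truncated_nuclear_norm(X, r):
    """Sum of all but the r largest singular values of X.
    With r = 0 this reduces to the ordinary nuclear norm."""
    s = np.linalg.svd(X, compute_uv=False)  # singular values, descending
    return s[r:].sum()

X = np.diag([3.0, 2.0, 1.0])
print(truncated_nuclear_norm(X, 0))  # 6.0: the nuclear norm, 3 + 2 + 1
print(truncated_nuclear_norm(X, 1))  # 3.0: the largest singular value is exempt
```

Exempting the r leading singular values from the penalty is what lets minimization push the residual singular values toward zero without shrinking the dominant rank-r part.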
“In the same way, to solve problem (11), this paper describes Algorithm 1, which uses iterative partial matrix shrinkage (IPMS) [9] for problem (16) and contains the algorithm for (11). Here, 0_{M,N} ∈ R^{M×N} denotes a zero matrix, and η_1, η_2 denote lower limits for the termination conditions…”
Section: Truncated Nuclear Norm Minimization Approach
confidence: 99%
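The IPMS update itself is not reproduced in this excerpt. As a loose illustration of partial shrinkage in general (keeping the r leading singular values untouched and soft-thresholding the rest), one might write the following; the function name and thresholding rule are our assumptions, not the actual IPMS step of [9]:

```python
import numpy as np

def partial_shrinkage(X, r, tau):
    """Soft-threshold all but the r leading singular values of X by tau.
    (An illustrative guess at a partial-shrinkage step, not the IPMS update of [9].)"""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = np.maximum(s[r:] - tau, 0.0)  # leading r values pass through unchanged
    return U @ np.diag(s) @ Vt

X = np.diag([3.0, 2.0, 1.0])
Y = partial_shrinkage(X, 1, 1.5)
print(np.linalg.svd(Y, compute_uv=False))  # singular values 3.0, 0.5, 0.0
```

This mirrors the truncated-nuclear-norm objective above: only the tail singular values are penalized, so the shrinkage operator leaves the dominant rank-r component intact while reducing the rest.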