2017
DOI: 10.1007/978-3-319-70139-4_1
Low-Rank and Sparse Matrix Completion for Recommendation

Cited by 16 publications (9 citation statements)
References 17 publications
“…While originally developed for the sparse spike estimation problem, it must be noted that the study of nonconvex optimization schemes for linear inverse problems has gained traction recently for different kinds of low-dimensional models. For low-rank matrix estimation, a smooth parametrization of the problem is possible, and it has been shown that a RIP guarantees the absence of spurious minima [3,30]. In [29], a model for phase recovery with alternated projections and smart initialization is considered.…”
Section: Related Work (mentioning)
confidence: 99%
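The "smooth parametrization" mentioned in the statement above refers to writing the unknown low-rank matrix as a product of two thin factors and running plain gradient descent on them (the Burer-Monteiro approach). A minimal sketch on synthetic data, with illustrative step size, iteration count, and problem sizes of my choosing:

```python
import numpy as np

def burer_monteiro_complete(M, mask, rank, lr=0.01, iters=3000, seed=0):
    """Complete a partially observed matrix by gradient descent on the
    smooth factorization M ~ U @ V.T (Burer-Monteiro parametrization)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, rank))   # small random init
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = mask * (U @ V.T - M)               # residual on observed entries only
        gU, gV = R @ V, R.T @ U                # gradients of 0.5*||mask*(UV^T - M)||_F^2
        U -= lr * gU
        V -= lr * gV
    return U @ V.T

# usage: recover a rank-2 matrix from ~60% of its entries (synthetic data)
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) < 0.6
A_hat = burer_monteiro_complete(A * mask, mask, rank=2)
err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)
```

Under RIP-type conditions, the cited results guarantee this nonconvex objective has no spurious local minima, which is why such a simple first-order scheme can succeed.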
“…Ning et al [47] proposed a sparse linear method, which multiplies the sparse aggregation and scoring matrices to obtain the prediction matrix. Zhao et al [57] proposed a low-rank and sparse matrix completion (LSMC) algorithm and verified that LSMC could be used to recommend products to users in food and movie datasets. LSMC relies only on the original interaction matrix to learn low-rank and sparse matrices.…”
Section: Low-Rank and Sparse Matrix Factorization (mentioning)
confidence: 99%
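The cited LSMC method learns low-rank and sparse matrices from the interaction matrix alone; its exact updates are not reproduced here. As an illustrative stand-in, the sketch below splits a matrix into a low-rank part plus a sparse part by alternating singular-value and entrywise soft-thresholding, a standard proximal-style heuristic (the thresholds and data are assumptions, not the paper's parameters):

```python
import numpy as np

def lowrank_plus_sparse(X, lam_l=1.0, lam_s=0.1, iters=50):
    """Split X into a low-rank part L and a sparse part S by alternating
    soft-thresholding (simplified illustration, not the exact LSMC updates)."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(iters):
        # low-rank step: soft-threshold the singular values of X - S
        U, sv, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U * np.maximum(sv - lam_l, 0.0)) @ Vt
        # sparse step: entrywise soft-threshold the remaining residual
        R = X - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam_s, 0.0)
    return L, S

# usage: a rank-1 "interaction" matrix corrupted by a few large spikes
rng = np.random.default_rng(0)
u, v = rng.standard_normal((20, 1)), rng.standard_normal((20, 1))
X = u @ v.T
X.flat[rng.choice(400, size=8, replace=False)] += 5.0
L, S = lowrank_plus_sparse(X)
```

By construction of the final soft-thresholding step, `L + S` matches `X` up to the entrywise threshold `lam_s`.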
“…Zhao and Udell (2020b) propose to impute missing data by learning a Gaussian copula model from incomplete observations, and show empirically that the resulting imputation achieves state-of-the-art performance. Following this line of work, Zhao and Udell (2020a) develop a low-rank Gaussian copula that scales well to large datasets, and Zhao, Landgrebe, Shekhtman, and Udell (2022) extend the model to online imputation of a streaming dataset using online model updates. This article introduces an additional methodological advance by extending the Gaussian copula model to support truncated variables.…”
Section: Introduction (mentioning)
confidence: 99%
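In the Gaussian copula approach described above, each margin is mapped to a latent normal score, and missing entries are imputed via the conditional mean of a multivariate Gaussian. The sketch below shows only that latent conditioning step; the marginal transforms and the model fitting from the cited papers are omitted, and `mu` and `Sigma` are assumed to be given:

```python
import numpy as np

def gaussian_conditional_impute(x, mu, Sigma):
    """Fill np.nan entries of one row with the conditional mean of the
    missing coordinates given the observed ones, under a joint Gaussian
    (the latent-space step of copula imputation, shown in isolation)."""
    obs = ~np.isnan(x)
    mis = np.isnan(x)
    S_oo = Sigma[np.ix_(obs, obs)]
    S_mo = Sigma[np.ix_(mis, obs)]
    # E[x_mis | x_obs] = mu_mis + S_mo S_oo^{-1} (x_obs - mu_obs)
    cond = mu[mis] + S_mo @ np.linalg.solve(S_oo, x[obs] - mu[obs])
    out = x.copy()
    out[mis] = cond
    return out

# usage with a hypothetical 3-variable model and one missing entry
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
filled = gaussian_conditional_impute(np.array([1.0, np.nan, 0.0]), mu, Sigma)
```

Observed entries pass through unchanged; only the missing coordinate is replaced by its conditional mean.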