2018
DOI: 10.1109/tit.2018.2809782
Minimax Lower Bounds for Noisy Matrix Completion Under Sparse Factor Models

Abstract: This paper examines fundamental error characteristics for a general class of matrix completion problems, where the matrix of interest is a product of two a priori unknown matrices, one of which is sparse, and the observations are noisy. Our main contributions come in the form of minimax lower bounds for the expected per-element squared error for this problem under several common noise models. Specifically, we analyze scenarios where the corruptions are characterized by additive Gaussian noise or additive heavi…
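As a rough sketch of the setting described in the abstract (the notation below is illustrative and is not necessarily the paper's own), the target matrix is the product of an unstructured factor and a sparse factor, entries are observed through a noisy channel, and accuracy is measured by the expected per-element squared error:

\begin{align*}
  X^{*} &= D^{*}A^{*}, && D^{*} \in \mathbb{R}^{n_1 \times r},\ \ A^{*} \in \mathbb{R}^{r \times n_2}\ \text{sparse},\\
  Y_{i,j} &\sim p\big(\cdot \mid X^{*}_{i,j}\big), && (i,j) \in \mathcal{S} \subseteq [n_1] \times [n_2],\\
  R\big(\widehat{X}, X^{*}\big) &= \frac{1}{n_1 n_2}\,\mathbb{E}\,\big\lVert \widehat{X} - X^{*} \big\rVert_F^2 . &&
\end{align*}

The paper's lower bounds quantify how small R can be made by any estimator, uniformly over this model class, under each of the noise models it considers.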

Cited by 6 publications (6 citation statements) | References 29 publications
“…Now the lower bound of candidate estimators for Poisson observations is given in the following proposition, whose proof follows a similar line to that of the matrix case in [42, Theorem 6]. For the sake of completeness, we give it here.…”
Section: Poisson Observations
Mentioning, confidence: 86%
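For orientation on why the Poisson case can be handled along the same lines as the matrix argument cited above: the quantity that typically controls the separation between candidate estimates in such lower-bound proofs is the Kullback-Leibler divergence between Poisson observation distributions, which has the closed form (a standard fact, stated here for context rather than quoted from either paper)

\[
  \mathrm{KL}\big(\mathrm{Poisson}(\lambda_1)\,\big\|\,\mathrm{Poisson}(\lambda_2)\big)
  \;=\; \lambda_1 \log\frac{\lambda_1}{\lambda_2} \;-\; \lambda_1 \;+\; \lambda_2,
  \qquad \lambda_1, \lambda_2 > 0 .
\]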
“…where the first inequality follows from (27), the second inequality follows from (43), and the last inequality follows from (42). Note that…”
Section: Poisson Observations
Mentioning, confidence: 99%
“…This demonstrates that the upper bound in Theorem 4.1 is nearly optimal [26,43]; see also [54]. The key step of the proof is to construct packing sets for the core tensors and the sparse factor matrices, respectively.…”
Section: Minimax Lower Bounds
Mentioning, confidence: 99%
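The packing-set construction mentioned in this excerpt is the usual route to such bounds: build a finite, well-separated family of candidate parameters, bound the Kullback-Leibler divergences between the induced observation distributions, and apply a Fano-type inequality. A generic form of that step, written with placeholder notation purely as a sketch of the technique, is: if $\{X^{(1)},\dots,X^{(M)}\}$ satisfies $d(X^{(j)},X^{(k)}) \ge 2\delta$ for all $j \neq k$, then

\[
  \inf_{\widehat{X}} \ \max_{1 \le j \le M} \ \mathbb{E}_{j}\, d\big(\widehat{X}, X^{(j)}\big)
  \;\ge\; \delta \left( 1 - \frac{\tfrac{1}{M^{2}} \sum_{j,k} \mathrm{KL}\big(P_{j} \,\|\, P_{k}\big) + \log 2}{\log M} \right),
\]

where $P_j$ denotes the distribution of the observations when the true parameter is $X^{(j)}$. The work quoted here adapts this recipe by packing both the core tensors and the sparse factor matrices.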
“…On the theoretical side, Soni et al. [46] proposed a noisy matrix completion method under a class of general noise models and established an error bound for the estimator of their proposed model, which specializes to sparse NMF and completion under nonnegative constraints. Moreover, Sambasivan et al. [43] showed that the error bound in [46] achieves the minimax error rates up to multiplicative constants and logarithmic factors. However, for multi-dimensional data, matrix-based methods may destroy the structure of a tensor when it is unfolded into a matrix.…”
Section: Introduction
Mentioning, confidence: 99%