2018
DOI: 10.3150/17-bej933

Adaptive confidence sets for matrix completion

Abstract: In the present paper we study the problem of existence of honest and adaptive confidence sets for matrix completion. We consider two statistical models: the trace regression model and the Bernoulli model. In the trace regression model, we show that honest confidence sets that adapt to the unknown rank of the matrix exist even when the error variance is unknown. Contrary to this, we prove that in the Bernoulli model, honest and adaptive confidence sets exist only when the error variance is known a priori. In th…
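
The abstract does not spell out the two observation models, so here is a minimal LaTeX sketch of their standard formulations (the notation M_0, \sigma, p below is our own assumption and should be checked against the paper's Section 1.1):

  \text{Trace regression:}\quad Y_i = \operatorname{tr}\bigl(X_i^\top M_0\bigr) + \sigma\,\xi_i,\qquad i = 1,\dots,n,
  \text{with design matrices } X_i \text{ sampled uniformly at random from the canonical basis } \{e_j e_k^\top\}.

  \text{Bernoulli model:}\quad Y_{jk} = B_{jk}\,\bigl(M_{0,jk} + \sigma\,\xi_{jk}\bigr),\qquad B_{jk}\sim\operatorname{Bernoulli}(p)\ \text{independently of the noise } \xi_{jk}.

In both cases the object of interest is a confidence set for the unknown low-rank matrix M_0 whose diameter adapts to the unknown rank.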

Cited by 17 publications (20 citation statements) | References 35 publications

“…The opening question is to choose the parameters of interest for the investigation. In [11] and [13], the authors proposed confidence regions of the matrix M with respect to the matrix Frobenius norm. In [5] and [12], the confidence intervals for M's entries are established.…”
Section: Background and Motivation (mentioning)
confidence: 99%
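
As an illustration of what a Frobenius-norm confidence region looks like in this setting (a generic form only, not the exact constructions of [11], [13], [5] or [12]; the symbols \hat{M} and r_\alpha are placeholders):

  C_\alpha = \bigl\{ M : \|M - \hat{M}\|_F^2 \le r_\alpha \bigr\},\qquad \inf_{M_0} \mathbb{P}_{M_0}\bigl(M_0 \in C_\alpha\bigr) \ge 1-\alpha,

where \hat{M} is a pilot estimator and the data-driven radius r_\alpha is calibrated so that coverage holds uniformly (honestly) over the parameter class. Entrywise confidence intervals, as in [5] and [12], instead take the form \hat{M}_{jk} \pm c_{jk,\alpha} for each entry (j,k).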
“…For example, Davenport et al (2014) study the problem of one-bit matrix completion, where the response is binary and the entrywise loss is logistic, with an additional ℓ∞-norm constraint on the entries of the matrix, and Lafond (2015) studies prediction error bounds for matrix completion for exponential family models with a nuclear norm penalty. Carpentier et al (2016) discuss confidence sets for the low-rank matrix completion problem, and Klopp et al (2015) consider a multinomial matrix completion problem where the observed entries are quantized with a few levels (in their framework the missingness need not be uniform). They study a regularized negative log-likelihood problem, where the latent variables are regularized by a nuclear norm penalty and an additional constraint on the maximal absolute entries of the matrix.…”
Section: Nuclear Norm Regularization (mentioning)
confidence: 99%
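
The regularized problem described above for Klopp et al (2015) can be written schematically as follows (our notation \ell_Y, \lambda, \gamma, not necessarily theirs):

  \hat{M} \in \operatorname*{arg\,min}_{\|M\|_\infty \le \gamma}\ \Bigl\{ -\ell_Y(M) + \lambda\,\|M\|_* \Bigr\},

where \ell_Y is the multinomial log-likelihood of the observed (quantized) entries, \|M\|_* is the nuclear norm penalizing the latent matrix, \|M\|_\infty = \max_{j,k}|M_{jk}| enforces the constraint on the maximal absolute entries, and \lambda, \gamma are tuning parameters.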
“…It is an important issue in applications to be able to say from the observations how well the recovery procedure has worked or, in the sequential sampling setting, to be able to give data-driven stopping rules that guarantee the recovery of the matrix M_0 at a given precision. This fundamental statistical question was recently studied in [7], where two statistical models for matrix completion are considered: the trace regression model and the Bernoulli model (for details see Section 1.1). In particular, in [7], the authors show that in the case of unknown noise variance, the information-theoretic structure of these two models is fundamentally different.…”
Section: Introduction (mentioning)
confidence: 99%