2020
DOI: 10.1109/jsait.2020.3040598

Information-Theoretic Limits for the Matrix Tensor Product

Abstract: This paper studies a high-dimensional inference problem involving the matrix tensor product of random matrices. This problem generalizes a number of contemporary data science problems, including the spiked matrix models used in sparse principal component analysis and covariance estimation, and the stochastic block model used in network analysis. The main results are single-letter formulas (i.e., analytical expressions that can be approximated numerically) for the mutual information and the minimum mean-squared error […]
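To make "single-letter formula" concrete, below is a minimal numerical sketch for the simplest special case of the spiked matrix models the abstract mentions: the rank-one spiked Wigner model with a Rademacher prior. The potential being minimized is the standard replica-symmetric expression from the low-rank matrix estimation literature, not the paper's general matrix tensor product formula; the function names, grid sizes, and quadrature order are illustrative assumptions.

```python
# Sketch (assumed special case, not the paper's general result): evaluate a
# single-letter formula for the rank-one spiked Wigner model
#   Y = sqrt(lambda/n) * x x^T + W,   x_i i.i.d. uniform on {-1, +1},
# where the limiting mutual information per variable is
#   min_{q in [0, 1]}  (lambda/4) * (1 - q)^2 + I(X0; sqrt(lambda*q) X0 + Z0).
# The scalar-channel term is evaluated by Gauss-Hermite quadrature.
import numpy as np

def scalar_mi_rademacher(snr, n_nodes=80):
    """I(X0; sqrt(snr) X0 + Z0) in nats for X0 uniform on {-1, +1}, Z0 ~ N(0, 1).

    Uses the identity I(snr) = snr - E[log cosh(snr + sqrt(snr) Z)], Z ~ N(0, 1).
    """
    if snr == 0.0:
        return 0.0
    # hermegauss integrates against exp(-x^2/2); divide by sqrt(2*pi) to get E_{Z~N(0,1)}.
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    expect = np.sum(weights * np.log(np.cosh(snr + np.sqrt(snr) * nodes))) / np.sqrt(2.0 * np.pi)
    return snr - expect

def single_letter_mi(lam, grid_size=1001):
    """Minimize the replica-symmetric potential over the scalar overlap q in [0, 1]."""
    q = np.linspace(0.0, 1.0, grid_size)
    potential = lam * (1.0 - q) ** 2 / 4.0 + np.array([scalar_mi_rademacher(lam * qi) for qi in q])
    i_star = int(np.argmin(potential))
    return potential[i_star], q[i_star]

if __name__ == "__main__":
    for lam in (0.5, 1.0, 2.0, 4.0):
        mi, q_star = single_letter_mi(lam)
        print(f"lambda = {lam:4.1f}: mutual info per variable = {mi:.4f} nats, overlap q* = {q_star:.3f}")
```

Below the weak-signal threshold the minimizing overlap stays at q* = 0 and the formula reduces to lambda/4; above it, q* becomes positive, which is the kind of phase-transition information these single-letter expressions expose.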

Cited by 21 publications (30 citation statements)
References 55 publications
“…This ensures the existence of a unique global solution over [0, 1] to the system of ODEs (68)-(69). Moreover, the latter implies also that the map $\epsilon \mapsto Q_\epsilon(\cdot)$ is still regular, because $F$ is positive as proved in Lemma 2 and $\partial \mathbb{E}\langle m \rangle_{\epsilon,t} / \partial Q_{\epsilon,r} \ge 0$ thanks again to (28). This guarantees the positivity of the trace in (64) and forces the vanishing of the remainder $R$ in $\epsilon$-average by Lemma 4.…”
Section: Proof of Theorem (citation type: mentioning)
confidence: 69%
“…After the completion of this work, paper [28] was brought to our attention where the mutual information for a wide class of inference problems is solved by means of a variational principle. While it is possible to obtain our model as an instance of the one considered there, the variational principle presented has no clear correspondence to ours.…”
Section: Discussion (citation type: mentioning)
confidence: 99%