2020
DOI: 10.48550/arxiv.2008.10097
Preprint

Testing correlation of unlabeled random graphs

Abstract: We study the problem of detecting the edge correlation between two random graphs with n unlabeled nodes. This is formalized as a hypothesis testing problem, where under the null hypothesis, the two graphs are independently generated; under the alternative, the two graphs are edge-correlated under some latent node correspondence, but have the same marginal distributions as the null. For both Gaussian-weighted complete graphs and dense Erdős-Rényi graphs (with edge probability $n^{-o(1)}$), we determine the sharp t…
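As a concrete illustration of the two hypotheses in the abstract, the minimal sketch below samples a pair of Erdős-Rényi graphs that are either independent (null) or edge-correlated through a common parent graph and a hidden node relabeling (alternative). The subsampling construction, the function name, and the parameters p and s are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def correlated_er_pair(n, p, s, correlated=True, rng=None):
    """Sample two n-node graphs with marginal edge density p*s.

    If `correlated`, both graphs keep each edge of a common parent
    G(n, p) independently with probability s; otherwise they are
    independent G(n, p*s). The second graph's nodes are relabeled by
    a uniformly random permutation, hiding the correspondence.
    """
    rng = np.random.default_rng(rng)
    iu = np.triu_indices(n, k=1)          # the n*(n-1)/2 node pairs
    m = len(iu[0])

    if correlated:
        parent = rng.random(m) < p        # common parent graph G(n, p)
        a_edges = parent & (rng.random(m) < s)
        b_edges = parent & (rng.random(m) < s)
    else:
        a_edges = rng.random(m) < p * s   # two independent G(n, p*s)
        b_edges = rng.random(m) < p * s

    A = np.zeros((n, n), dtype=int); A[iu] = a_edges; A += A.T
    B = np.zeros((n, n), dtype=int); B[iu] = b_edges; B += B.T

    perm = rng.permutation(n)             # hide the node correspondence
    return A, B[np.ix_(perm, perm)]
```

For example, correlated_er_pair(200, 0.5, 0.8) returns an edge-correlated pair with marginal edge density 0.4 whose latent correspondence is hidden by the random relabeling; the testing problem asks whether such a pair can be distinguished from an independent one.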

Cited by 5 publications (21 citation statements) | References 42 publications
“…As the first step, we leverage the previous truncated second moment computation in [WXY20] to conclude that the KL divergence $D(P_{A,B} \| Q_{A,B})$ is negligible under the desired conditions. By expressing the mutual information as $I(\pi; A, B) = \binom{n}{2} D(P \| Q) - D(P_{A,B} \| Q_{A,B})$, where $D(P \| Q) = I(P)$, this readily implies that $I(\pi; A, B) = \binom{n}{2} I(P)(1 + o(1))$.…”
Section: Negative Results on Partial Recovery
confidence: 99%
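In the notation assumed above ($P$, $Q$ the laws of a single aligned edge pair under the alternative and the null, and $P_{A,B}$, $Q_{A,B}$ the laws of the full graph pair; this reading of the excerpt's symbols is an assumption), the quoted identity follows from a standard KL rewriting:
\[
I(\pi; A, B)
= \mathbb{E}_{\pi}\, D\!\left(P_{A,B \mid \pi} \,\|\, P_{A,B}\right)
= \mathbb{E}_{\pi}\, D\!\left(P_{A,B \mid \pi} \,\|\, Q_{A,B}\right) - D\!\left(P_{A,B} \,\|\, Q_{A,B}\right)
= \binom{n}{2} D(P \,\|\, Q) - D\!\left(P_{A,B} \,\|\, Q_{A,B}\right),
\]
where the middle equality uses $\mathbb{E}_{\pi}\,\mathbb{E}_{P_{A,B \mid \pi}} \log \tfrac{dP_{A,B}}{dQ_{A,B}} = D(P_{A,B} \,\|\, Q_{A,B})$, and the last uses that, given $\pi$, the $\binom{n}{2}$ node pairs carry i.i.d. edge pairs with law $P$ under the alternative and law $Q$ under the null. Since $D(P_{A,B} \,\|\, Q_{A,B}) \ge 0$ is negligible by the first step, $I(\pi; A, B) = \binom{n}{2} I(P)(1 + o(1))$.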
“…This method, however, is often too loose as the second moment can be derailed by rare events. Thus a more robust version is by means of the truncated second moment, which has been carried out in [WXY20] to bound $\mathrm{TV}(P_{A,B}, Q_{A,B})$ for studying the hypothesis testing problem in graph matching. Here we leverage the same result to bound the KL divergence.…”
Section: Proof of Proposition
confidence: 99%
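For context, the truncated second moment method can be stated generically as follows (the specific truncation event used in [WXY20] is particular to that paper and not reproduced here). The plain second moment bound is
\[
\mathrm{TV}(P, Q) \le \tfrac{1}{2}\sqrt{\chi^2(P \,\|\, Q)},
\qquad
1 + \chi^2(P \,\|\, Q) = \mathbb{E}_Q\!\left[\Big(\tfrac{dP}{dQ}\Big)^{2}\right],
\]
which fails when a rare event inflates the likelihood-ratio second moment. Writing $\tilde{P}(\cdot) = P(\,\cdot \cap \mathcal{E})$ for the restriction of $P$ to a good event $\mathcal{E}$ with $P(\mathcal{E}) = 1 - o(1)$, the triangle inequality gives the truncated version
\[
\mathrm{TV}(P, Q) \le \tfrac{1}{2}\, P(\mathcal{E}^c) + \tfrac{1}{2}\sqrt{\,\mathbb{E}_Q\!\left[\Big(\tfrac{d\tilde{P}}{dQ}\Big)^{2}\right] - 2P(\mathcal{E}) + 1\,},
\]
so $\mathrm{TV}(P, Q) = o(1)$ as soon as the truncated second moment is $1 + o(1)$, even if the full second moment diverges on $\mathcal{E}^c$.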
“…In particular, Theorem 1.1 solves [43, Section 6, Open Problem 3]. It is worth emphasizing that (1.3) for $\alpha = 1$ with $np \to \infty$ was already proved in [43] (which even allows $\epsilon \to 0$ as long as $\epsilon \gg n^{-1/3}$). While our method should also be able to give (1.3) for $\alpha = 1$, we chose to exclude this case since the assumption $\alpha < 1$ allows us to avoid some technical complications.…”
Section: Introduction
confidence: 82%
“…Our work is closely related to and much inspired by a recent work [43], where a sharp threshold was established for $\alpha = 0$ and upper and lower bounds on $\lambda^*$ up to a constant factor were established for $\alpha \in (0, 1]$. In particular, Theorem 1.1 solves [43, Section 6, Open Problem 3].…”
Section: Introduction
confidence: 93%