2017
DOI: 10.1002/rsa.20713

Concentration and regularization of random graphs

Abstract: This paper studies how close random graphs are typically to their expectations. We interpret this question through the concentration of the adjacency and Laplacian matrices in the spectral norm. We study inhomogeneous Erdős–Rényi random graphs on n vertices, where edges form independently and possibly with different probabilities p_ij. Sparse random graphs whose expected degrees are o(log n) fail to concentrate; the obstruction is caused by vertices with abnormally high and low degrees. We show that c…
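The concentration phenomenon in the abstract can be probed numerically: in the dense regime (expected degree well above log n) the deviation ||A - E[A]|| stays on the order of the square root of the expected degree, while in the sparse regime high-degree vertices inflate it. A minimal sketch, not from the paper; the parameter choices below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

def spectral_deviation(p):
    """Sample a G(n, p) adjacency matrix A and return ||A - E[A]||
    in the spectral norm (largest singular value)."""
    upper = rng.random((n, n)) < p          # Bernoulli(p) entries
    A = np.triu(upper, 1)                   # strict upper triangle
    A = (A + A.T).astype(float)             # symmetrize, zero diagonal
    EA = p * (np.ones((n, n)) - np.eye(n))  # expectation (no self-loops)
    return np.linalg.norm(A - EA, 2)

d_dense = 2 * np.log(n)   # expected degree above log n: concentration holds
d_sparse = 0.5            # expected degree o(log n): concentration fails
for d in (d_dense, d_sparse):
    dev = spectral_deviation(d / n)
    print(f"d = {d:.2f}: ||A - EA|| = {dev:.2f}, sqrt(d) = {np.sqrt(d):.2f}")
```

Comparing the printed deviation against sqrt(d) mirrors the theorem's benchmark scale; in the sparse run the ratio is driven up by the few abnormally high-degree vertices.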

Cited by 101 publications (127 citation statements). References 48 publications (108 reference statements).
“…To address this issue without making the regularization procedure more complex, we are going to use an additional structural decomposition for Bernoulli random matrices, first shown in the work of Le, Levina and Vershynin [9]. The next proposition is a direct corollary of [9, Theorem 2.6]: Proposition 2.3 (Decomposition lemma).…”
Section: 3
Citation type: mentioning (confidence: 99%)
“…The rest of the paper is structured as follows. The proof of Theorem 1.1 is based on the previously known regularization results developed for Bernoulli random matrices (mainly in the works of Feige and Ofek [6], and Le, Levina and Vershynin [9]). In Section 2 we review some results specific to the Bernoulli matrices and briefly explain how they will be used later in the text.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…The problem of partitioning a graph into two well-connected subcommunities can be viewed as synchronization over the group {±1} ≅ Z/2: each vertex has a latent group element g_u ∈ {±1}, its community identity, and each edge is a noisy measurement of the relative status g_u g_v^{-1} [5]. A number of more structured approaches have been shown to improve over PCA, including modified spectral methods [37,38,44,50,59] and semidefinite programming [1,2,32,33,49]. However, this approach breaks down in sparse graphs: localized noise eigenvectors associated to high-degree vertices dominate the spectrum.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
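The Z/2 synchronization viewpoint in the quotation above can be illustrated with a PCA-style estimator on a planted two-community graph: the second eigenvector of the adjacency matrix carries the community signal (the top one tracks overall degree). A minimal sketch under assumed parameters; the model and values are illustrative, not taken from the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)
n, a, b = 400, 12.0, 3.0   # planted 2-community model, intra/inter edge rates a/n, b/n

g = np.repeat([1, -1], n // 2)                        # latent labels g_u in {+1, -1}
P = np.where(np.outer(g, g) > 0, a / n, b / n)        # edge probabilities
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                           # symmetric, no self-loops

# PCA-style estimate: sign of the second-largest eigenvector of A
vals, vecs = np.linalg.eigh(A)                        # ascending eigenvalues
g_hat = np.sign(vecs[:, -2])
overlap = abs(g_hat @ g) / n                          # agreement with the truth, up to sign
print(f"overlap with truth: {overlap:.2f}")
```

At average degree near log n this works; pushing a and b down into the truly sparse regime makes the localized high-degree eigenvectors described in the quotation take over.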
“…This is essentially a failure of PCA to adequately exploit the problem structure, as these localized eigenvectors lie far from the constraint that the truth is entrywise {±1}. A number of more structured approaches have been shown to improve over PCA, including modified spectral methods [37,38,44,50,59] and semidefinite programming [1,2,32,33,49]. A major algorithmic challenge in this problem is to obtain an efficient algorithm that optimally exploits this structure, to obtain the minimum possible estimation error.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…The problem of bounding the difference between eigenvalues of A and those of the adjacency matrix of G n (p ij ), together with its Laplacian spectra version, has been studied intensively recently; see, e.g., [3,8,9]. It is revealed in [9] that large deviation from the expected spectrum is caused by vertices with extremal degrees, where abnormally high-degree and low-degree vertices are obstructions to concentration of the adjacency and the Laplacian matrices, respectively. A regularization technique is employed to address this issue.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
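The regularization technique referred to above, trimming abnormally high-degree vertices in the style of Feige-Ofek and Le-Levina-Vershynin, can be sketched as follows. The truncation threshold 2d is an illustrative choice, not the papers' exact constant:

```python
import numpy as np

def regularize(A, d):
    """Zero out the rows and columns of vertices whose degree exceeds 2d
    (a simple variant of degree-truncation regularization; the factor 2
    is an illustrative threshold)."""
    deg = A.sum(axis=1)
    drop = deg > 2 * d
    A_reg = A.copy()
    A_reg[drop, :] = 0.0
    A_reg[:, drop] = 0.0
    return A_reg

rng = np.random.default_rng(2)
n, d = 1000, 1.0               # very sparse: expected degree 1 << log n
A = (rng.random((n, n)) < d / n).astype(float)
A = np.triu(A, 1)
A = A + A.T

EA = (d / n) * (np.ones((n, n)) - np.eye(n))
before = np.linalg.norm(A - EA, 2)
after = np.linalg.norm(regularize(A, d) - EA, 2)
print(f"||A - EA|| = {before:.2f}  ->  ||A' - EA|| = {after:.2f}")
```

After trimming, every surviving vertex has degree at most 2d, which caps the spectral norm of the regularized matrix; this is the mechanism by which trimming restores concentration at the sqrt(d) scale.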