2018
DOI: 10.1093/comnet/cny013
Switch chain mixing times and triangle counts in simple random graphs with given degrees

Abstract: Sampling uniform simple graphs with power-law degree distributions with degree exponent τ ∈ (2, 3) is a non-trivial problem. We propose a method to sample uniform simple graphs that uses a constrained version of the configuration model together with a Markov Chain switching method. We test the convergence of this algorithm numerically in the context of the presence of small subgraphs. We then compare the number of triangles in uniform random graphs with the number of triangles in the erased configuration model…
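The paper's constrained configuration model and its exact switching scheme are not spelled out in this snippet; as a rough illustration of the kind of Markov chain switching ("switch chain") the abstract refers to, here is a minimal degree-preserving double edge swap chain in Python. All names and details below are illustrative, not taken from the paper.

```python
import random

def switch_chain_step(edges, adj):
    """One attempted double edge swap: pick two edges {a,b}, {c,d} and
    propose {a,d}, {c,b}; accept only if the graph stays simple.
    `edges` is a list of 2-tuples, `adj` maps each vertex to its set of
    neighbours. Degrees are preserved by construction."""
    i, j = random.sample(range(len(edges)), 2)
    a, b = edges[i]
    c, d = edges[j]
    # Reject swaps that would create a self-loop or a multiple edge.
    if len({a, b, c, d}) < 4 or d in adj[a] or b in adj[c]:
        return
    adj[a].remove(b); adj[b].remove(a)
    adj[c].remove(d); adj[d].remove(c)
    adj[a].add(d); adj[d].add(a)
    adj[c].add(b); adj[b].add(c)
    edges[i] = (a, d)
    edges[j] = (c, b)

def switch_chain(edges, n_steps, seed=None):
    """Run the switch chain for n_steps attempted swaps, starting from a
    simple graph with the target degree sequence, and return the edge list
    of the (approximately uniform) sampled simple graph."""
    random.seed(seed)
    edges = list(edges)
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    for _ in range(n_steps):
        switch_chain_step(edges, adj)
    return edges
```

Convergence of such a chain is typically monitored empirically, for example by tracking small-subgraph counts such as triangles across iterations, which matches the kind of numerical test the abstract describes.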

Cited by 8 publications (11 citation statements)
References 39 publications
“…Note that 1 − e^{−x} ≥ x/(1 + x) for all x > 0. Interestingly, this implies that the erased configuration model contains more triangles than the uniform random graph with the same degree sequence, even though edges are removed in the erased configuration model, due to the presence of multiple edges and loops in the configuration model, which was empirically observed in [1]. This is because the large-degree vertices in the uniform random graph model have more low-degree neighbours than in the erased configuration model.…”
Section: Number of Triangles
confidence: 87%
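The inequality in the quoted statement can be verified with a one-line argument. The derivation below is added here for clarity and is not part of the cited text; it reads x as the usual pairwise weight d_i d_j / L_n, so that 1 − e^{−x} and x/(1 + x) play the role of approximate connection probabilities in the erased configuration model and the uniform simple graph, respectively.

```latex
% Since e^{x} \ge 1 + x for every real x, we have e^{-x} \le 1/(1+x) for x > -1, hence
\[
  1 - e^{-x} \;\ge\; 1 - \frac{1}{1+x} \;=\; \frac{x}{1+x}, \qquad x > 0.
\]
% Edge for edge, the erased configuration model is therefore at least as likely
% to connect any given pair of vertices, which is consistent with it containing
% more triangles than the uniform random graph with the same degrees.
```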
“…Then the number of 2-paths from any specified vertex is bounded by 1) by (1.4). Since τ ∈ (2, 3) the above is o(n).…”
Section: Connection Probability Estimates
confidence: 99%
“…We also note some additional works that are less directly related to ours. Network-rewiring and MCMC algorithms are widely used to sample static networks [156-160]; in stationarity, these can be viewed as temporal networks satisfying the Equilibrium Property, with a level of persistence tunable via the number of iterations between adjacent snapshots. Adaptive network models (for instance, SIS dynamics [161] alongside contact switching [23,24]) have dynamic node properties that evolve with time and guide network evolution, a commonality with THVMs.…”
Section: Related Work
confidence: 99%
“…Due to this feature the configuration model is widely used to analyze the influence of degrees on other properties or processes on networks [11,14,15,25,32,38]. An important property that many networks share is that their degree distributions are regularly varying, with the exponent γ of the degree distribution satisfying γ ∈ (1,2), so that the degrees have infinite variance. In this regime of degrees, the configuration model results in a simple graph with vanishing probability.…”
Section: Motivation
confidence: 99%
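To illustrate the last remark concretely (a rough numerical sketch under assumed parameters, not taken from the cited work): pairing half-edges uniformly at random for a heavy-tailed degree sequence almost always produces self-loops or multiple edges, and the fraction of simple outcomes shrinks as the graph grows.

```python
import random

def heavy_tailed_degrees(n, gamma=1.5, seed=None):
    """Degrees with a regularly varying tail, P(D >= k) ~ k^(-gamma);
    gamma in (1, 2) gives a finite mean but infinite variance."""
    rng = random.Random(seed)
    degs = [max(1, int((1.0 - rng.random()) ** (-1.0 / gamma))) for _ in range(n)]
    if sum(degs) % 2:            # the configuration model needs an even degree sum
        degs[0] += 1
    return degs

def pairing_is_simple(degrees, rng):
    """Pair half-edges uniformly at random (the configuration model) and
    report whether the resulting multigraph has no self-loops or multi-edges."""
    half_edges = [v for v, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(half_edges)
    seen = set()
    for u, v in zip(half_edges[::2], half_edges[1::2]):
        edge = (min(u, v), max(u, v))
        if u == v or edge in seen:
            return False
        seen.add(edge)
    return True

rng = random.Random(0)
for n in (100, 1_000, 10_000):
    degs = heavy_tailed_degrees(n, gamma=1.5, seed=1)
    simple = sum(pairing_is_simple(degs, rng) for _ in range(200))
    print(n, simple / 200)       # fraction of simple pairings; drops towards 0
```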
“…This is important because very commonly in the literature, various quantities measured in real-world networks are compared to null models with the same degrees but random rewiring. These rewired null models are similar to a version of the inhomogeneous random graph [2,12]. Without knowing the scaling of these quantities in the inhomogeneous random graph, it is not possible to assess how similar a small measured value on the real network is to that of the null model.…”
Section: Motivation
confidence: 99%
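As a toy version of the comparison described above (a sketch only: networkx's generic double edge swap stands in for whatever rewiring scheme a given study uses, and the Barabási–Albert graph is just a placeholder for a real-world network):

```python
import networkx as nx

def triangle_count(G):
    """Total number of triangles in an undirected graph (each counted once)."""
    return sum(nx.triangles(G).values()) // 3

# Placeholder "real-world" network; substitute your own data here.
G = nx.barabasi_albert_graph(n=500, m=3, seed=42)
observed = triangle_count(G)

# Degree-preserving rewired null model: many double edge swaps per replicate.
null_counts = []
for s in range(20):
    H = G.copy()
    nx.double_edge_swap(H, nswap=10 * H.number_of_edges(),
                        max_tries=100 * H.number_of_edges(), seed=s)
    null_counts.append(triangle_count(H))

mean_null = sum(null_counts) / len(null_counts)
print(f"observed triangles: {observed}, rewired-null mean: {mean_null:.1f}")
```

The spread of null_counts gives a sense of how unusual the observed count is under the rewired null model; the quoted point is that, without knowing how such quantities scale in the corresponding inhomogeneous random graph, it is hard to judge whether a small measured value is genuinely close to the null model.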