2020
DOI: 10.1080/10618600.2020.1740714
Bayesian Variational Inference for Exponential Random Graph Models

Abstract: Deriving Bayesian inference for exponential random graph models (ERGMs) is a doubly intractable problem as the normalizing constants of both the likelihood and posterior density are intractable. Markov chain Monte Carlo (MCMC) methods which yield Bayesian inference for ERGMs, such as the exchange algorithm, are asymptotically exact but computationally intensive, as a network has to be drawn from the likelihood at every step using a "tie no tie" sampler or some other algorithm. In this article, we develop a var…
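The "tie no tie" (TNT) sampler mentioned in the abstract can be sketched for a toy ERGM with edge and triangle statistics. Everything below — the function name `tnt_sample`, the two-statistic model, and the simplified proposal bookkeeping — is a hypothetical illustration, not the paper's implementation:

```python
import numpy as np

def tnt_sample(n, theta, n_steps=20000, seed=0):
    """Simulate from a toy ERGM with edge + triangle statistics using a
    'tie no tie' (TNT) Metropolis sampler. theta = (theta_edges, theta_tri).
    Minimal sketch; production samplers (e.g. in the ergm package) are
    far more elaborate."""
    rng = np.random.default_rng(seed)
    y = np.zeros((n, n), dtype=int)   # adjacency matrix, undirected
    n_dyads = n * (n - 1) // 2
    edges = []                        # list of present ties (i < j)
    for _ in range(n_steps):
        # TNT proposal: with prob 1/2 toggle a random existing tie,
        # otherwise toggle a uniformly random dyad.
        if edges and rng.random() < 0.5:
            i, j = edges[rng.integers(len(edges))]
        else:
            i, j = sorted(rng.choice(n, size=2, replace=False))
        present = y[i, j] == 1
        # change statistics for toggling dyad (i, j)
        d_edges = -1 if present else 1
        d_tri = d_edges * int(np.sum(y[i] * y[j]))  # common neighbours
        log_lik_ratio = theta[0] * d_edges + theta[1] * d_tri
        # proposal correction for the asymmetric TNT kernel
        E = len(edges)
        if present:  # removing a tie
            q_fwd = 0.5 / E + 0.5 / n_dyads
            q_rev = 0.5 / n_dyads if E > 1 else 1.0 / n_dyads
        else:        # adding a tie
            q_fwd = 0.5 / n_dyads if E > 0 else 1.0 / n_dyads
            q_rev = 0.5 / (E + 1) + 0.5 / n_dyads
        if np.log(rng.random()) < log_lik_ratio + np.log(q_rev / q_fwd):
            y[i, j] = y[j, i] = 1 - y[i, j]
            if present:
                edges.remove((i, j))
            else:
                edges.append((i, j))
    return y
```

The point of the TNT proposal is efficiency on sparse graphs: half the proposals concentrate on the (few) existing ties, rather than wasting moves on absent dyads.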

Cited by 9 publications (8 citation statements)
References 47 publications
“…It may be possible to reduce the number of ERGM simulations at each MCMC iteration using noisy Monte Carlo methods (Alquier et al, 2016). Another promising avenue is variational inference for ERGMs (Tan and Friel, 2020), which could be extended to our framework to yield approximate Bayesian inference at a much reduced computational cost relative to MCMC.…”
Section: Discussion
confidence: 99%
“…The algorithm targets an augmented posterior

π(θ, θ′, y′ | y) ∝ π(θ | y) h(θ′ | θ) π(y′ | θ′)   (5)

where π(θ | y) is the original (target) posterior, h(θ′ | θ) is an arbitrary, normalisable proposal function, and π(y′ | θ′) is the likelihood of the auxiliary variable. For simplicity, we assume h(θ′ | θ) to be symmetric.…”
Section: Posterior Computation
confidence: 99%
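The exchange algorithm targeting the augmented posterior in the quote above can be illustrated on a toy doubly-intractable model where exact auxiliary simulation is easy. The model (an Exponential(θ) likelihood whose normalising constant 1/θ we pretend is unknown), the Exp(1) prior, and the name `exchange_sampler` are all assumptions made for this sketch:

```python
import numpy as np

def exchange_sampler(y_obs, n_iter=5000, step=0.5, seed=1):
    """Exchange algorithm on a toy doubly-intractable target: the
    unnormalised likelihood q(y|theta) = exp(-theta * y), y > 0, whose
    normalising constant 1/theta we pretend is unknown.
    Prior: theta ~ Exponential(1). Minimal illustrative sketch."""
    rng = np.random.default_rng(seed)
    log_q = lambda y, th: -th * y   # unnormalised log-likelihood
    log_prior = lambda th: -th      # Exp(1) prior, up to a constant
    theta = 1.0
    draws = []
    for _ in range(n_iter):
        theta_p = theta + step * rng.normal()  # symmetric proposal h
        if theta_p > 0:
            # auxiliary draw y' ~ pi(.|theta'); here exact simulation
            y_aux = rng.exponential(1.0 / theta_p)
            # exchange acceptance ratio: the unknown normalising
            # constants of theta and theta_p cancel exactly
            log_alpha = (log_q(y_obs, theta_p) + log_q(y_aux, theta)
                         + log_prior(theta_p)
                         - log_q(y_obs, theta) - log_q(y_aux, theta_p)
                         - log_prior(theta))
            if np.log(rng.random()) < log_alpha:
                theta = theta_p
        draws.append(theta)
    return np.array(draws)
```

For an ERGM the auxiliary draw `y_aux` would instead require a full network simulation (e.g. a TNT run) at every iteration, which is exactly the cost the variational approach of Tan and Friel seeks to avoid.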
“…This cannot be done by brute force except for trivially small graphs (n ≲ 7), and the roughness of the underlying function precludes simple Monte Carlo strategies; thus, alternative approaches that approximate or avoid this calculation are of substantial interest. To date, the most frequently used approaches include: maximum pseudo-likelihood estimation (MPLE; Besag (1974)) adapted by Strauss and Ikeda (1990); Markov chain Monte Carlo MLE (MCMC MLE; Geyer and Thompson (1992)) by Handcock (2003); Hunter and Handcock (2006); approximate MLE based on stochastic approximation (SA; Robbins and Monro (1951); Pflug (1996)) by Snijders (2002); fully Bayesian inference based on the exchange algorithm (Caimo and Friel, 2011); approximate Bayesian inference based on adjusted pseudolikelihood (Bouranis et al, 2017); and variational Bayesian inference based on fully adjusted pseudolikelihood (Tan and Friel, 2020). As simulation-based MLE-finding algorithms (e.g., MCMC MLE, SA) rely on a good initial parameter configuration to seed their simulations, there is also some work on this aspect, including the partial stepping technique (Hummel et al, 2012) and contrastive divergence (CD, Hinton (2002))-based techniques adapted to ERGMs by Krivitsky (2017).…”
Section: Definition and Estimation
confidence: 99%
“…Bouranis et al (2018) proposed an adjusted pseudolikelihood for correcting the mode, curvature and magnitude of the pseudolikelihood, and their simulation studies show that the adjusted pseudolikelihood can provide an accurate approximation to the true likelihood in the presence of strong dyadic dependence (where pseudolikelihood falls short). Building upon adjusted pseudolikelihood approximation, Tan and Friel (2020) developed a variety of variational methods for Gaussian approximation of the posterior density and model selection, which is shown to yield comparable performance to that of the exchange algorithm. The adjusted pseudolikelihood is defined as follows,…”
Section: Pseudolikelihood
confidence: 99%
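The (unadjusted) pseudolikelihood that the quoted passage builds on reduces, for ERGMs, to a logistic regression of each tie indicator on its change statistics (Strauss and Ikeda). The sketch below — the function name `mple`, the edges-plus-triangles toy model, and the plain Newton iterations — is a hypothetical illustration, not the adjusted pseudolikelihood of Bouranis et al.:

```python
import numpy as np

def mple(y, n_iter=50):
    """Maximum pseudolikelihood estimate for a toy ERGM with edge and
    triangle statistics: logistic regression of each tie on its change
    statistics. Newton-Raphson in plain numpy; a minimal sketch, not
    the ergm package's estimator."""
    n = y.shape[0]
    X, z = [], []
    for i in range(n):
        for j in range(i + 1, n):
            # change statistics for toggling dyad (i, j) on; the common-
            # neighbour count does not depend on y[i, j] itself
            d_tri = int(np.sum(y[i] * y[j]))
            X.append([1.0, d_tri])   # intercept = edge change statistic
            z.append(y[i, j])
    X, z = np.array(X), np.array(z)
    beta = np.zeros(2)
    for _ in range(n_iter):          # Newton-Raphson / IRLS
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None]) + 1e-8 * np.eye(2)
        beta += np.linalg.solve(H, X.T @ (z - p))
    return beta                      # (theta_edges, theta_triangles)
```

The quoted passage's point is that this estimator, while cheap, misbehaves under strong dyadic dependence; the mode/curvature/magnitude adjustments of Bouranis et al. correct for that before the pseudolikelihood is plugged into a Bayesian (or variational) procedure.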
“…Bayesian inference on networks has seen great progress in areas such as estimation (Hoff et al, 2002; Handcock et al, 2007), exponential random graph models (Caimo and Friel, 2011; Thiemichen et al, 2016; Tan and Friel, 2020) and community detection (Psorakis et al, 2011; Mørup and Schmidt, 2012; van der Pas and van der Vaart, 2018). Bayesian methods for CP structure, however, are far less developed.…”
Section: Bayesian
confidence: 99%