2017
DOI: 10.4310/sii.2017.v10.n2.a15
A local structure model for network analysis

Cited by 13 publications (23 citation statements); references 0 publications.
“…Markov structures may arise with exponential random graph models (e.g., Kolaczyk 2009) and latent variable models (Hoff, Raftery, and Handcock 2002) as two common types of probabilistic models for networks; see Hunter, Krivitsky, and Schweinberger (2012) and references therein. Local structure graph models provide another MRF modeling approach for random graphs, whereby each graph edge has a formulated conditional distribution that depends on neighborhoods of other graph edges; see Casleton, Nordman, and Kaiser (2017). We next examine conclique-based simulation from one such model.…”
Section: Simulation of a Large Network
confidence: 99%
“…As shown in Casleton et al. (2014b), a correspondence between ERGMs and LSGMs is demonstrated through the negpotential function. The joint distribution for both is a Gibbs distribution, which can be specified through the negpotential function up to a constant.…”
Section: Higher-order Dependence
confidence: 89%
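The negpotential construction mentioned in the quoted passage can be sketched as follows. This is a standard form in the spirit of Kaiser and Cressie (2000), not an equation from the paper itself; the reference configuration \(\mathbf{y}^{*}\) and the notation are illustrative assumptions.

```latex
% Gibbs form of the joint distribution via the negpotential function Q.
% y^* is a fixed reference configuration with f(y^*) > 0.
Q(\mathbf{y}) = \log\!\left\{\frac{f(\mathbf{y})}{f(\mathbf{y}^{*})}\right\},
\qquad
f(\mathbf{y}) \propto \exp\{Q(\mathbf{y})\}
```

The proportionality constant is the normalizer of the Gibbs distribution, which is generally intractable; this is why both ERGMs and LSGMs are typically handled through their conditional or negpotential specifications rather than the joint density directly.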
“…Network science as a field is sizable, diverse, and rapidly growing. Both the traditional formulations of ERGMs and LSGMs have joint distributions in Gibbsian form (Casleton et al., 2014b), but for an LSGM these joint distributions must be constructed from the set of specified conditional distributions (Kaiser and Cressie, 2000), which may be accomplished under certain conditions.…”
Section: Introduction
confidence: 99%
“…Even for a graph with a small number of nodes, there can be many edges, meaning that the single-site (sequential) Gibbs sampler would be computationally demanding, but the conclique-based approach could be applied to reduce the number of computations in each iteration. For example, in the graph models of Casleton, Nordman, and Kaiser (2017), there is a geographic notion (a radius of influence) whereby nodes in the graph that are far apart in distance (or other covariates) will not share an edge. It follows that collections of nodes separated by such a distance under the MRF model can be used to define concliques for edges in the graph (observations as edges that do not neighbor other edges in the graph).…”
Section: Discussion
confidence: 99%
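As a concrete illustration of the conclique-based updating described in the quoted passage, the sketch below runs a two-conclique ("checkerboard") Gibbs sampler for a binary autologistic MRF on a four-nearest-neighbor lattice, where the two parity classes are concliques because no site neighbors another site of the same parity. This is a minimal sketch, not the cited authors' implementation; the parameters `alpha`, `eta`, the grid size, and the function name are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conclique_gibbs_step(x, alpha=0.0, eta=0.4):
    """One full sweep: within each conclique (checkerboard parity class),
    sites are conditionally independent given the other conclique, so an
    entire conclique is updated simultaneously instead of site by site."""
    for parity in (0, 1):
        # sum of the four nearest neighbors, zero-padded at the boundary
        s = np.zeros_like(x, dtype=float)
        s[1:, :] += x[:-1, :]
        s[:-1, :] += x[1:, :]
        s[:, 1:] += x[:, :-1]
        s[:, :-1] += x[:, 1:]
        # conditional probability P(x_ij = 1 | neighbors) for every site
        p = 1.0 / (1.0 + np.exp(-(alpha + eta * s)))
        # update only the sites in the current conclique
        mask = (np.indices(x.shape).sum(axis=0) % 2) == parity
        x[mask] = (rng.random(x.shape) < p)[mask].astype(int)
    return x

x = rng.integers(0, 2, size=(20, 20))
for _ in range(100):
    x = conclique_gibbs_step(x)
```

Each sweep costs two vectorized block updates rather than one update per edge or site, which is the computational saving the passage refers to; the same idea extends to edge variables in a graph model once concliques of non-neighboring edges are identified.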
“…Such MRF models have become popular for modeling temporally- or spatially-dependent areal data (Cressie 1993), image segmentation (Zhang, Brady, and Smith 2001), computer vision (S. Z. Li 2012), and positron emission tomography (Higdon 1998), among other challenging applications including the analysis of networks (cf. Strauss and Ikeda 1990; Hoff, Raftery, and Handcock 2002; Casleton, Nordman, and Kaiser 2017). In addition to providing a route for model formulation, another reason for the popularity of MRF specifications is that observation-wise conditional distributions fit naturally with the Gibbs sampler (S. Geman and Geman 1984) for generating data realizations via Markov chain Monte Carlo (MCMC) methods (Gelfand and Smith 1990).…”
Section: Introduction
confidence: 99%