2019 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata47090.2019.9006204

Towards Interpretable Graph Modeling with Vertex Replacement Grammars

Abstract: An enormous amount of real-world data exists in the form of graphs. Oftentimes, interesting patterns that describe the complex dynamics of these graphs are captured in the form of frequently reoccurring substructures. Recent work at the intersection of formal language theory and graph theory has explored the use of graph grammars for graph modeling and pattern mining. However, existing formulations do not extract meaningful and easily interpretable patterns from the data. The present work addresses this limitation…


Cited by 6 publications (4 citation statements) | References 32 publications
“…A recent addition to the class of graph generators are hyperedge [9,45] and node replacement [7,8] grammars. Graph grammars contain graphical rewriting rules that match and replace graph fragments, similar to how a context-free string grammar rewrites characters in a string.…”
Section: Related Work
confidence: 99%
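The quoted statement above describes the core mechanism: a node (vertex) replacement grammar rewrites a nonterminal vertex into a graph fragment, analogous to how a context-free string grammar rewrites a symbol into a string. A minimal sketch of a single rule application, using a plain adjacency-dict graph (all names here are illustrative, not the paper's actual API):

```python
# Toy node-replacement step: replace a nonterminal vertex with a
# fragment, reattaching the nonterminal's old neighbors to a designated
# attachment vertex. This is a simplified sketch, not the VRG of [7,8].

def apply_rule(graph, nonterminal, fragment, attach):
    """Rewrite `nonterminal` in `graph` (dict: node -> set of neighbors)
    into `fragment` (same encoding); former neighbors of the nonterminal
    are reconnected to the `attach` vertex of the fragment."""
    neighbors = graph.pop(nonterminal)          # remove the nonterminal
    for n in neighbors:
        graph[n].discard(nonterminal)           # drop its incident edges
    for node, adj in fragment.items():          # splice in the fragment
        graph.setdefault(node, set()).update(adj)
    for n in neighbors:                         # restore boundary edges
        graph[n].add(attach)
        graph[attach].add(n)
    return graph

# Start graph: a - N - b, where N is the nonterminal vertex.
g = {"a": {"N"}, "b": {"N"}, "N": {"a", "b"}}
# Rule N -> triangle {x, y, z}, with x as the attachment vertex.
frag = {"x": {"y", "z"}, "y": {"x", "z"}, "z": {"x", "y"}}
g = apply_rule(g, "N", frag, "x")
```

After the rewrite, `a` and `b` connect to `x`, which sits inside the new triangle; repeating such steps from a start symbol generates a graph, mirroring a context-free derivation on strings.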
“…1(A). Attributed Vertex Replacement Grammar (AVRG) builds on recent advances in graph grammars [7][8][9], which were designed 1 The source code is available at https://github.com/satyakisikdar/Attributed-VRG only to model and generate homogeneous graphs, to handle attributed graphs, increasing its applicability to a broad class of modern problems.…”
Section: Introduction
confidence: 99%
“…Since SBMs' introduction, they have been extended to handle edge-weighted [14], bipartite [15], temporal [16], and hierarchical networks [17]. Likewise, Exponential Random Graph Models (ERGMs) [18], Kronecker graph models [19,20,21], and graph grammar models [22,23,24] are able to generate graphs that are more-or-less faithful to the source graph.…”
Section: Graph Models
confidence: 99%
“…These include Chung Lu model [2], clustering-based node replacement graph grammars (CNRG) [23], block two-level Erdős–Rényi (BTER) [41], degree-corrected stochastic block models (SBM) [42], hyperedge replacement graph grammars (HRG) [43,22], Kronecker graphs [19,20], bottom-up graph grammar extractor (BUGGE) [24], generative adversarial network (NetGAN) [30], graph linear autoencoder (LinearAE) [26], graph convolutional neural networks (GCNAE) [27], and graph recurrent neural networks (GraphRNN) [25]. Random graphs generated using the Erdős–Rényi model with an edge probability equal to the density of the input graph are also included as a baseline.…”
Section: Graph Models
confidence: 99%