2019
DOI: 10.48550/arxiv.1908.09710
Preprint

Variational Graph Recurrent Neural Networks

Ehsan Hajiramezanali, Arman Hasanzadeh, Nick Duffield, et al.

Abstract: Representation learning over graph structured data has been mostly studied in static graph settings, while efforts for modeling dynamic graphs are still scant. In this paper, we develop a novel hierarchical variational model that introduces additional latent random variables to jointly model the hidden states of a graph recurrent neural network (GRNN) to capture both topology and node attribute changes in dynamic graphs. We argue that the use of high-level latent random variables in this variational GRNN (VGRNN)…
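To make the abstract's mechanism concrete, below is a minimal PyTorch sketch of a VGRNN-style cell: a graph-convolutional encoder and a prior conditioned on the recurrent hidden state produce per-node Gaussian latent variables, which drive a per-node GRU update and an inner-product edge decoder. All layer names, sizes, and the dense-adjacency simplification are illustrative assumptions, not the authors' reference implementation.

# A minimal, illustrative VGRNN-style cell. This is a sketch of the idea in
# the abstract (latent random variables conditioned on the hidden state of a
# graph RNN), NOT the paper's reference code. All dimensions are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gcn(x, adj, linear):
    # One dense graph-convolution step: symmetrically normalize A + I,
    # then propagate and transform node features.
    a = adj + torch.eye(adj.size(0), device=adj.device)
    d = a.sum(dim=1).pow(-0.5)
    a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)
    return F.relu(linear(a_norm @ x))

class VGRNNCell(nn.Module):
    def __init__(self, x_dim, h_dim, z_dim):
        super().__init__()
        self.phi_x = nn.Linear(x_dim, h_dim)           # feature extractor
        self.gcn_enc = nn.Linear(h_dim + h_dim, h_dim) # q(z_t | x_t, h_{t-1})
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        self.prior = nn.Linear(h_dim, h_dim)           # p(z_t | h_{t-1})
        self.prior_mu = nn.Linear(h_dim, z_dim)
        self.prior_logvar = nn.Linear(h_dim, z_dim)
        self.phi_z = nn.Linear(z_dim, h_dim)
        self.rnn = nn.GRUCell(h_dim + h_dim, h_dim)    # per-node recurrence

    def forward(self, x_t, adj_t, h_prev):
        x_feat = F.relu(self.phi_x(x_t))
        # Encoder: graph-conv over current features and previous hidden state.
        enc = gcn(torch.cat([x_feat, h_prev], dim=-1), adj_t, self.gcn_enc)
        mu_q, logvar_q = self.enc_mu(enc), self.enc_logvar(enc)
        # Prior conditioned on the recurrent state (the hierarchical part).
        pri = F.relu(self.prior(h_prev))
        mu_p, logvar_p = self.prior_mu(pri), self.prior_logvar(pri)
        # Reparameterized sample of node-level latent variables.
        z_t = mu_q + torch.randn_like(mu_q) * (0.5 * logvar_q).exp()
        z_feat = F.relu(self.phi_z(z_t))
        # Recurrence: update per-node hidden states from features and latents.
        h_t = self.rnn(torch.cat([x_feat, z_feat], dim=-1), h_prev)
        # Inner-product decoder: edge logits for snapshot reconstruction.
        adj_logits = z_t @ z_t.t()
        # KL(q || p) between the two diagonal Gaussians.
        kl = 0.5 * (logvar_p - logvar_q
                    + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
                    - 1).sum()
        return h_t, adj_logits, kl

# Smoke test on two random snapshots of a hypothetical 10-node graph.
cell = VGRNNCell(x_dim=16, h_dim=32, z_dim=8)
h = torch.zeros(10, 32)
for _ in range(2):
    x_t = torch.randn(10, 16)
    adj_t = (torch.rand(10, 10) < 0.2).float()
    adj_t = ((adj_t + adj_t.t()) > 0).float()  # symmetrize
    h, adj_logits, kl = cell(x_t, adj_t, h)

The per-timestep KL term against the state-conditioned prior, plus an edge-reconstruction loss on adj_logits, would form the variational objective; carrying h across snapshots is what lets the latents track both topology and attribute changes.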

Cited by 7 publications (7 citation statements) · References 19 publications
“…Another line of work on temporal (knowledge) graph reasoning uses message-passing networks to capture intra-graph neighborhood information, sometimes combined with temporal recurrence or attention mechanisms (Manessi et al., 2020; Kumar et al., 2018; Pareja et al., 2019; Chen et al., 2018; Jin et al., 2019; Sankar et al., 2020; Hajiramezanali et al., 2019). Orthogonal to our work, Trivedi et al. (2017, 2019) and Han et al. (2020) explore using temporal point processes.…”
Section: Related Work
Confidence: 94%
“…Since dynamic graph representations add a time dimension to static ones, RNN-based DGNNs [18, 28] are used to summarize temporal information over time. However, the computation of RNN-based DGNNs is expensive because RNNs need a large amount of graph data for training.…”
Section: Dynamic Graph Neural Network
Confidence: 99%
“…GGNN (Li et al., 2015) is another time-and-graph application where node attributes do not change, so it does not have the GNN-in part that most other time-and-graph methods have (see Equation (9)). There are also variational-autoencoder works (Hajiramezanali et al., 2019) and generative-adversarial works (Lei et al., 2019; Xiong et al., 2019) that extend the time-and-graph works introduced earlier, e.g., using time-and-graph as the encoder of a generative adversarial framework.…”
Section: E Related Work
Confidence: 99%