2022
DOI: 10.48550/arxiv.2206.03469
Preprint
FDGNN: Fully Dynamic Graph Neural Network

Abstract: Dynamic Graph Neural Networks recently became more and more important as graphs from many scientific fields, ranging from mathematics, biology, social sciences, and physics to computer science, are dynamic by nature. While temporal changes (dynamics) play an essential role in many real-world applications, most of the models in the literature on Graph Neural Networks (GNN) process static graphs. The few GNN models on dynamic graphs only consider exceptional cases of dynamics, e.g., node attribute-dynamic graphs…

Cited by 2 publications (3 citation statements) · References 3 publications
“…We find that the dynamic GNN models [20,27,28,34,45,51,57,63,79] designed for graph streams adopt the traditional batched training mode to learn parameters. Each batch consists of a sequence of historical events {e[1], e[2], …”
Section: Dynamic GNN Training Abstraction
confidence: 99%
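The batched training mode described in this citation statement can be sketched in a few lines: a dynamic GNN consumes a stream of timestamped graph events and groups them into fixed-size batches for each parameter update. The names below (`GraphEvent`, `batches`) are illustrative assumptions, not taken from FDGNN or any citing work.

```python
# Hypothetical sketch of batched training over a graph event stream.
# Each batch holds a consecutive slice {e[i], ..., e[i+k]} of historical events.
from dataclasses import dataclass
from typing import Iterable, Iterator, List

@dataclass
class GraphEvent:
    src: int          # source node of the interaction
    dst: int          # destination node
    timestamp: float  # time at which the edge event occurred

def batches(events: Iterable[GraphEvent], batch_size: int) -> Iterator[List[GraphEvent]]:
    """Group a historical event stream into consecutive training batches."""
    batch: List[GraphEvent] = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Usage: each yielded batch would be fed to the model's update step.
stream = [GraphEvent(i, i + 1, float(i)) for i in range(5)]
print([len(b) for b in batches(stream, 2)])  # [2, 2, 1]
```

In this mode the model never revisits future events when processing a batch, which is what distinguishes streaming training from full-graph (static) training.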
“…However, real-world graphs are inherently dynamic and evolve over time. Recently, many dynamic GNN models [20,27,28,34,45,51,57,63,79] have emerged as promising methods for learning from dynamic graphs. These models capture both spatial and temporal information, which makes them outperform traditional GNNs in real-time applications, such as real-time fraud detection [57], real-time recommendation [20], and many other tasks.…”
Section: Introduction
confidence: 99%
“…Concurrently, another investigative work [13] harnesses the Hawkes process, emphasizing the modeling of dynamic link additions, with a concentrated focus on their temporal evolution. Conversely, FDGNN [14] introduces an avant-garde framework dedicated to node and edge embeddings. This method showcases proficiency in capturing Temporal Point Processes, hinting at the potential for devising encodings symbiotic with incoming graph events.…”
Section: Related Work
confidence: 99%
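The temporal point process mentioned in this statement is typically a Hawkes process, whose conditional intensity rises after each observed event and decays back toward a base rate. As a purely illustrative sketch (not FDGNN's actual implementation), the univariate Hawkes intensity used to model dynamic link additions looks like this; the parameter values are arbitrary assumptions:

```python
# Conditional intensity of a univariate Hawkes process:
#   lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
# mu: base rate, alpha: excitation weight, beta: decay rate.
import math

def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
    """Intensity at time t given past event times t_i < t."""
    return mu + alpha * sum(
        math.exp(-beta * (t - ti)) for ti in event_times if ti < t
    )

# With no history, the intensity equals the base rate mu.
print(hawkes_intensity(1.0, []))               # 0.1
# A past event at t=0 excites the intensity at t=1 by alpha * e^{-beta}.
print(round(hawkes_intensity(1.0, [0.0]), 4))  # 0.2839
```

The self-exciting structure is what makes this process a natural fit for edge streams: a burst of interactions between two nodes raises the modeled probability of further interactions in the near future.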