2023
DOI: 10.48550/arxiv.2302.01018
Preprint

Graph Neural Networks for temporal graphs: State of the art, open challenges, and opportunities

Abstract: Graph Neural Networks (GNNs) have become the leading paradigm for learning on (static) graph-structured data. However, many real-world systems are dynamic in nature, since the graph and node/edge attributes change over time. In recent years, GNN-based models for temporal graphs have emerged as a promising area of research to extend the capabilities of GNNs. In this work, we provide the first comprehensive overview of the current state of the art of temporal GNNs, introducing a rigorous formalization of learning s…

Cited by 2 publications (2 citation statements)
References 43 publications (70 reference statements)
“…These include univariate time series models [24,25], similarity-based methods [26], probabilistic generative models [27][28][29][30], and matrix and tensor factorization [31]. Nevertheless, with the success of deep learning and representation learning on static graphs, recent approaches have shifted towards using neural networks, as surveyed by Kazemi et al. [9] and Longa et al. [10]. Notably, memory-based dynamic graph neural networks (DGNNs) such as TGN [21] and DyRep [32] use an encoder-decoder architecture.…”
Section: Methods for DLP (Dynamic Link Prediction)
confidence: 99%
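The encoder-decoder pattern described above can be sketched in a few lines. This is a deliberately simplified illustration, not the actual TGN or DyRep method: the update weights, memory size, and function names here are all hypothetical, and the real models use learned GRU-based memory modules and temporal attention.

```python
import numpy as np

# Hypothetical sketch of a memory-based DGNN: each node keeps a memory
# vector; the encoder updates both endpoints' memories on every
# interaction event, and the decoder scores a candidate future link
# from the two memories.

DIM = 8
rng = np.random.default_rng(0)
memory = {}                                      # node id -> memory vector
W = rng.normal(scale=0.1, size=(2 * DIM, DIM))   # toy (untrained) update weights

def get_mem(node):
    return memory.setdefault(node, np.zeros(DIM))

def encode_event(u, v, t):
    """Update both endpoints' memories from an interaction at time t
    (a crude stand-in for the learned memory update in TGN)."""
    mu, mv = get_mem(u), get_mem(v)
    memory[u] = np.tanh(np.concatenate([mu, mv]) @ W)
    memory[v] = np.tanh(np.concatenate([mv, mu]) @ W)

def decode_link(u, v):
    """Score a candidate future link as a dot product of the two
    memories, squashed to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-get_mem(u) @ get_mem(v)))

# Replay a toy event stream, then score a possible future link.
for u, v, t in [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0)]:
    encode_event(u, v, t)
score = decode_link(0, 1)
```

In a trained model, `W` would be replaced by learned parameters and the decoder by an MLP; the point here is only the division of labor between a memory-updating encoder and a link-scoring decoder.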
“…The resulting types of data are commonly referred to as Continuous-Time Dynamic Graphs (CTDGs) [8]. Modeling and forecasting CTDGs have recently become very active fields of research, as suggested by recent surveys [9,10]. A crucial task of interest is Dynamic Link Prediction (DLP), where the goal is to predict future links from a history of observed ones.…”
Section: Introduction
confidence: 99%
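The CTDG and DLP setting above is easy to make concrete. A minimal sketch, assuming nothing beyond the definitions in the quoted passage: a CTDG stored as a time-ordered list of (u, v, t) events, and a trivial DLP baseline that ranks candidate pairs by how often they interacted in the observed history (the names are illustrative, not from any specific library).

```python
from collections import Counter

# A CTDG as a time-ordered stream of timestamped interaction events.
events = [("a", "b", 1.0), ("b", "c", 2.0), ("a", "b", 3.0), ("a", "c", 4.0)]

def history_counts(events, until_t):
    """Count past interactions per (unordered) node pair up to time until_t."""
    counts = Counter()
    for u, v, t in events:
        if t <= until_t:
            counts[frozenset((u, v))] += 1
    return counts

def predict_link(counts, u, v):
    """DLP baseline: score a future (u, v) link by its past frequency."""
    return counts[frozenset((u, v))]

counts = history_counts(events, until_t=3.0)
ab_score = predict_link(counts, "a", "b")   # ("a", "b") seen twice before t=3
ac_score = predict_link(counts, "a", "c")   # not yet observed at t=3
```

Real DLP methods replace the frequency heuristic with learned temporal embeddings, but the task interface is the same: observe events up to some time, then score candidate future links.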