2019 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2019.8852131

Autoregressive Models for Sequences of Graphs

Abstract: This paper proposes an autoregressive (AR) model for sequences of graphs, which generalises traditional AR models. A first novelty consists in formalising the AR model for a very general family of graphs, characterised by a variable topology, and attributes associated with nodes and edges. A graph neural network (GNN) is also proposed to learn the AR function associated with the graph-generating process (GGP), and subsequently predict the next graph in a sequence. The proposed method is compared with four base…
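The abstract's core idea, predicting the next graph in a sequence from the previous ones, can be illustrated with a toy linear AR(1) process over weighted adjacency matrices. This is a hypothetical sketch for intuition only: the paper's actual model learns a nonlinear AR function with a GNN and supports variable topology and node/edge attributes, none of which appear here.

```python
import numpy as np

# Hypothetical sketch: a linear AR(1) graph-generating process (GGP)
# over fixed-size adjacency matrices. The true coefficient alpha and
# the noise level are illustrative choices, not from the paper.

rng = np.random.default_rng(0)
n = 5  # number of nodes (fixed here; the paper allows variable topology)

def ar_step(A, alpha=0.8, noise=0.05):
    """One AR(1) step: next adjacency = alpha * A + symmetric noise."""
    E = rng.normal(scale=noise, size=(n, n))
    A_next = alpha * A + (E + E.T) / 2
    np.fill_diagonal(A_next, 0.0)  # no self-loops
    return A_next

# Generate a short sequence of weighted graphs.
A = rng.random((n, n)); A = (A + A.T) / 2; np.fill_diagonal(A, 0.0)
sequence = [A]
for _ in range(10):
    sequence.append(ar_step(sequence[-1]))

# A naive predictor: estimate alpha by least squares on consecutive
# pairs, then predict the next graph in the sequence.
xs = np.stack(sequence[:-1]).ravel()
ys = np.stack(sequence[1:]).ravel()
alpha_hat = float(xs @ ys / (xs @ xs))
A_pred = alpha_hat * sequence[-1]
print(round(alpha_hat, 2))  # estimate of the AR coefficient
```

A GNN-based predictor would replace the scalar `alpha_hat` with a learned function of the whole previous graph (topology plus attributes), which is what lets the model handle changing structure.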

Cited by 7 publications (5 citation statements). References 21 publications.
“…In this sense, the stable convergence in distribution paves the way towards an extension to a random and/or time-dependent adjacency matrix (e.g. as an additional, potentially non-ergodic, process) [51,5].…”
Section: With An Unknown Adjacency Matrix
confidence: 97%
“…We start by assessing the performance of hybrid global-local spatiotemporal models in a controlled environment, considering a variation of GPVAR, a synthetic dataset based on a polynomial graph filter [45] introduced by Zambon and Alippi [46], that we modify to include local effects. In particular, data are generated from the spatiotemporal process…”
Section: Synthetic Data
confidence: 99%
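The GPVAR excerpt above refers to synthetic data generated through a polynomial graph filter. As background, here is a minimal sketch of such a filter; the shift operator, graph, and coefficients are illustrative assumptions, not the exact process used in the cited works.

```python
import numpy as np

# Hypothetical sketch of a polynomial graph filter: the output is a
# polynomial in a graph shift operator S applied to a node signal x,
# i.e. H(S) x = sum_k theta[k] * S^k x. The graph and theta below are
# arbitrary examples for illustration.

rng = np.random.default_rng(1)
n = 6
A = rng.random((n, n))
A = ((A + A.T) / 2 > 0.5).astype(float)  # symmetric binary adjacency
np.fill_diagonal(A, 0.0)
deg = A.sum(1).clip(min=1.0)
S = A / deg[:, None]  # row-normalised adjacency as shift operator

def poly_filter(x, theta):
    """Apply H(S) x = sum_k theta[k] * S^k x via repeated shifts."""
    out = np.zeros_like(x)
    Skx = x.copy()          # S^0 x
    for t in theta:
        out += t * Skx
        Skx = S @ Skx       # advance to the next power of S
    return out

x = rng.normal(size=n)
y = poly_filter(x, theta=[0.5, 0.3, 0.1])
print(y.shape)
```

Each power of `S` mixes information from one hop further away, so a degree-K filter aggregates a K-hop neighbourhood; a spatiotemporal process built on such a filter couples each node's dynamics to its neighbours'.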
“…Probabilistic models As per any machine learning problem, dealing with uncertainty associated with the data-generating process and the learned model can be the key to accurate and trustworthy DGN solutions. Despite the recent progress in integrating DGNs with probabilistic components [28,29,30], estimating the uncertainty associated with combinatorial objects like the adjacency matrix and estimating non-factorized probability distribution over graphs are two of the most challenging problems.…”
Section: Promising Directions
confidence: 99%