2022
DOI: 10.1145/3470889

Graph Sequence Neural Network with an Attention Mechanism for Traffic Speed Prediction

Abstract: Recent years have witnessed the emerging success of Graph Neural Networks (GNNs) for modeling graphical data. A GNN can model the spatial dependencies of nodes in a graph based on message passing through node aggregation. However, in many application scenarios, these spatial dependencies can change over time, and a basic GNN model cannot capture these changes. In this article, we propose a Graph Seq…
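The node-aggregation idea in the abstract can be sketched in a few lines: each node's new representation is a transformed average over its neighborhood (including itself). This is a minimal illustrative layer, not the paper's actual model; all weights and dimensions here are assumptions.

```python
import numpy as np

def gnn_layer(H, A, W):
    """One mean-aggregation message-passing layer.

    H: (n, d) node features; A: (n, n) adjacency matrix; W: (d, d_out) weights.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops so each node keeps its own signal
    deg = A_hat.sum(axis=1, keepdims=True)    # neighborhood sizes
    H_agg = (A_hat @ H) / deg                 # mean over each node's neighborhood
    return np.maximum(H_agg @ W, 0.0)         # linear transform + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-node path graph
H = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
H_next = gnn_layer(H, A, W)
print(H_next.shape)  # (3, 2)
```

Stacking such layers lets information propagate over longer paths in the graph, which is what a fixed adjacency cannot do when the spatial dependencies themselves drift over time.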


Cited by 11 publications (8 citation statements)
References 29 publications
“…Each layer in the multi-layer network represents a system or a subsystem, and the connection between the networks is realized based on the relationship between the actual system's layers [34,35]. Each layer in the multi-layer network may exhibit different local and global structural characteristics than those in the single-layer network, which provides an effective method for analyzing the interaction between different systems and the formation mechanism of the network [36][37][38].…”
Section: Multi-layer Network (mentioning; confidence: 99%)
“…Attention mechanisms typically involve a scoring mechanism that calculates the relevance or attention weight for each time step, followed by a weighted combination of the time step representations to produce a context vector that is then used for making predictions or further processing. Attention mechanisms in time series analysis have shown effectiveness in tasks such as sequence classification, forecasting, and anomaly detection, as they enable models to selectively attend to relevant temporal information while disregarding irrelevant or noisy segments of the time series [63][64][65].…”
Section: Attention (mentioning; confidence: 99%)
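The attention pipeline described in the excerpt above (score each time step, normalize the scores, take the weighted combination as a context vector) can be sketched as follows. The additive scoring form and the names `W_a` and `v` are illustrative assumptions, not taken from any of the cited papers.

```python
import numpy as np

def temporal_attention(H, W_a, v):
    """Additive attention over time steps.

    H: (T, d) time-step representations.
    Returns the (d,) context vector and the (T,) attention weights.
    """
    scores = np.tanh(H @ W_a) @ v             # relevance score for each time step
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()                  # attention weights sum to 1
    context = weights @ H                     # weighted combination of time steps
    return context, weights

rng = np.random.default_rng(1)
H = rng.standard_normal((5, 8))               # 5 time steps, 8-dim hidden states
W_a = rng.standard_normal((8, 8))
v = rng.standard_normal(8)
context, weights = temporal_attention(H, W_a, v)
print(context.shape)  # (8,)
```

Because the weights are a softmax, noisy or irrelevant time steps receive weights near zero, which is exactly the "selectively attend" behavior the excerpt describes.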
“…The Graph Neural Network family as a whole includes a broad range of subjects, such as reinforcement learning, unsupervised learning, supervised learning, and semi-supervised learning. For instance, a model proposed by Lu et al [25] based on GNNs has the capability of mapping a graph G, including all its nodes, to an m-dimensional Euclidean space with n ∈ N. On the other hand, Reza et al [26] proposed a model that is an extension of Lu et al's [25], in which the authors used gated recurrent elements for making output GNN sequences. Bogaerts et al [27] proposed a GCNN model by employing CNNs for handling graph data in addition to a variety of alternative GNNs.…”
Section: Graph Neural Network-based Approaches (mentioning; confidence: 99%)
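The "gated recurrent elements for making output GNN sequences" mentioned in the excerpt can be illustrated by running a per-node GRU cell over a sequence of (already graph-aggregated) features. This is a generic GRU sketch under assumed weight shapes, not the parameterization of any cited model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, Wz, Wr, Wh):
    """One GRU step. x: (d,) input; h: (d,) previous state; W*: (2d, d) weights."""
    xh = np.concatenate([x, h])
    z = sigmoid(xh @ Wz)                                  # update gate
    r = sigmoid(xh @ Wr)                                  # reset gate
    h_tilde = np.tanh(np.concatenate([x, r * h]) @ Wh)    # candidate state
    return (1 - z) * h + z * h_tilde                      # gated interpolation

rng = np.random.default_rng(2)
d = 4
Wz, Wr, Wh = (rng.standard_normal((2 * d, d)) * 0.1 for _ in range(3))
h = np.zeros(d)
for t in range(6):                 # one node's 6-step feature sequence
    x = rng.standard_normal(d)     # stand-in for graph-aggregated features at step t
    h = gru_cell(x, h, Wz, Wr, Wh)
print(h.shape)  # (4,)
```

The gates let the recurrent state retain long-range temporal context while the graph layer supplies spatial context at each step, which is the combination these sequence-oriented GNN variants exploit.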