2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI)
DOI: 10.1109/ictai50040.2020.00114
ST-MGAT: Spatial-Temporal Multi-Head Graph Attention Networks for Traffic Forecasting

Cited by 7 publications (5 citation statements). References 6 publications.
“…A machine learning method for regression problems.

- FC-LSTM [50]: Long short-term memory network, a sequence-to-sequence model based on the recurrent neural network with fully-connected LSTM hidden units.
- DCRNN [49]: Diffusion convolutional recurrent neural network, which uses diffusion graph convolutional networks and the encoder-decoder structure to capture the spatial and temporal dependencies, respectively.
- STGCN [6]: Spatio-temporal graph convolution networks, which integrate the gated temporal convolution unit into graph convolution blocks.
- GraphWaveNet [17]: Graph WaveNet combines an adaptive adjacency matrix with graph convolution and utilizes stacked 1D convolution units to capture the spatial-temporal dependency.
- STSGCN [14]: Spatial-temporal synchronous graph convolutional networks, which effectively capture the localized and long-range spatial-temporal dependencies through a spatial-temporal synchronous modeling mechanism.
- GMAN [9]: Graph multi-attention network, an encoder-decoder architecture with multiple spatial-temporal attention blocks. The transform attention layers enable modeling the impact of spatio-temporal factors on complex traffic conditions.
- ST-MGAT [51]: Spatial-temporal multi-head graph attention network, which consists of temporal convolution blocks and graph attention networks for capturing the dynamic temporal correlations and spatial relations between nodes, respectively.…”
Section: Methods
confidence: 99%
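Several of the baselines quoted above (GMAN, ST-MGAT) build on graph attention, where each node aggregates its neighbors' features with learned, softmax-normalized weights. As an illustration only, here is a minimal single-head graph attention layer in NumPy; the function and parameter names are hypothetical, and this is a sketch of the standard GAT formulation, not any paper's released code.

```python
import numpy as np

def graph_attention_head(H, A, W, a_src, a_dst):
    """One graph attention head (GAT-style sketch).
    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F') projection; a_src, a_dst: (F',) attention vector halves."""
    Z = H @ W                                   # project features -> (N, F')
    # e_ij = LeakyReLU(a_src . z_i + a_dst . z_j), computed densely
    e = np.add.outer(Z @ a_src, Z @ a_dst)      # (N, N) raw scores
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU, negative slope 0.2
    e = np.where(A > 0, e, -np.inf)             # mask non-edges before softmax
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # row-wise softmax over neighbors
    return alpha @ Z                            # attention-weighted aggregation
```

A multi-head variant, as in ST-MGAT's name, would run several such heads with independent parameters and concatenate (or average) their outputs.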
“…The transform attention layers enable modeling the impact of the spatio-temporal factors on complex traffic conditions. ST-MGAT [51]: Spatial-temporal multi-head graph attention network, which consists of temporal convolution blocks and graph attention networks for capturing the dynamic temporal correlations and spatial relations between nodes, respectively.…”
Section: Baseline Description
confidence: 99%
“…Using a matrix factorization technique, Bai et al. [33] proposed a convolutional GNN module that can apply node-specific parameters. Attentional GNNs have also been widely used in traffic forecasting research [21], [26], [27], [36], [41], [45], [52], [66], [73]. The gated attention networks (GaAN) [26] outperformed diffusion convolution in short-term traffic forecasting when combined with GRU.…”
Section: B. Spatial Feature Extraction with Graph Neural Networks
confidence: 99%
“…The gated attention networks (GaAN) [26] outperformed diffusion convolution in short-term traffic forecasting when combined with GRU. GAT [79] has also been adopted in many studies [21], [27], [36], [41], [52], [73]. Park et al [66] constructed a new attentional GNN layer that adopts the scaled dot-product attention [61] with sentinel vectors to control the information from neighbor nodes.…”
Section: B. Spatial Feature Extraction with Graph Neural Networks
confidence: 99%
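The scaled dot-product attention [61] mentioned in the excerpt above computes softmax(QKᵀ/√d_k)V. A minimal NumPy sketch, for illustration only (the sentinel-vector extension of Park et al. [66] is not modeled here):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.
    Q: (n, d_k), K: (m, d_k), V: (m, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)             # scaled similarity, (n, m)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)          # softmax over keys
    return w @ V                                # weighted sum of values
```

Scaling by √d_k keeps the dot products from saturating the softmax as the key dimension grows.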
“…STGCN [18], AGCRN [35], and the proposed model GAGW in this paper belong to spatiotemporal graph models. On the other hand, based on their temporal-layer processing strategies, models can be further categorized into RNN-based prediction models such as DCRNN [21], TCN-based models such as GWNET [36] and ST-MGAT [37], and self-attention-based models such as STTN [38].…”
Section: Baseline and Metric
confidence: 99%
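The TCN-based models named above (GWNET, ST-MGAT) rely on gated, dilated causal convolutions in the WaveNet style: a tanh filter branch multiplied by a sigmoid gate branch. A minimal single-channel NumPy sketch under that assumption; the function name and parameterization are hypothetical:

```python
import numpy as np

def gated_dilated_conv(x, w_f, w_g, dilation=1):
    """Gated causal dilated convolution on a 1-D series.
    x: (T,) input; w_f, w_g: (k,) filter and gate taps.
    Output t depends only on x[t], x[t-d], ..., x[t-(k-1)d] (causal)."""
    k = len(w_f)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])     # left-pad so no future leaks in
    f = np.zeros_like(x)
    g = np.zeros_like(x)
    for i in range(k):                          # accumulate each dilated tap
        seg = xp[i * dilation : i * dilation + len(x)]
        f += w_f[k - 1 - i] * seg
        g += w_g[k - 1 - i] * seg
    return np.tanh(f) * (1.0 / (1.0 + np.exp(-g)))  # tanh filter x sigmoid gate
```

Stacking such layers with exponentially growing dilations (1, 2, 4, ...) widens the temporal receptive field without recurrence, which is the usual motivation for TCN blocks over RNNs.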