2021
DOI: 10.48550/arxiv.2104.14917
Preprint
Dynamic Graph Convolutional Recurrent Network for Traffic Prediction: Benchmark and Solution

Abstract: Traffic prediction is the cornerstone of intelligent transportation systems. Accurate traffic forecasting is essential for smart-city applications such as intelligent traffic management and urban planning. Although various methods have been proposed for spatio-temporal modeling, they ignore the dynamic character of the correlations among locations on the road network. Meanwhile, most Recurrent Neural Network (RNN) based works are not efficient enough due to their recurrent operations. Additionally, there is a s…

Cited by 14 publications (25 citation statements) · References 25 publications
“…• DGCRN [38]: The model proposes a supernetwork that adaptively generates dynamic adjacency matrices step by step, which significantly improves forecasting performance.…”
Section: Baseline Methods (mentioning)
confidence: 99%
“…GWNET-conv [37] introduces a new covariance loss on top of Graph WaveNet that significantly improves its forecasting accuracy. DGCRN [38] uses a supernetwork to generate an adjacency matrix and merges it with the original road-network matrix to capture spatial correlations dynamically. The STGNN [39] model provides a learnable spatial graph neural network with a positional attention mechanism and captures local and global temporal correlations using GRU and transformer layers.…”
Section: Related Work (mentioning)
confidence: 99%
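Both excerpts above describe the same core mechanism: a hypernetwork regenerates the graph at every recurrent step and blends it with the fixed road-network graph. Below is a minimal PyTorch sketch of that idea; the class name, the single-linear-layer hypernetwork, and the scalar blend weight alpha are illustrative assumptions rather than DGCRN's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGraphGenerator(nn.Module):
    """Illustrative sketch of step-wise dynamic adjacency generation.

    At each recurrent step, a small hypernetwork (assumed here to be a
    single linear layer) maps the hidden state to per-node embeddings;
    their pairwise similarities form a dynamic adjacency matrix that is
    blended with the static road-network adjacency.
    """

    def __init__(self, hidden_dim, embed_dim, alpha=0.5):
        super().__init__()
        self.hyper = nn.Linear(hidden_dim, embed_dim)  # hypothetical hypernetwork
        self.alpha = alpha                             # assumed blend weight

    def forward(self, h, static_adj):
        # h: (num_nodes, hidden_dim) hidden state at the current step
        e = torch.tanh(self.hyper(h))        # (num_nodes, embed_dim)
        scores = F.relu(e @ e.t())           # pairwise node similarities
        dyn_adj = F.softmax(scores, dim=-1)  # row-normalized dynamic graph
        # Merge the dynamic graph with the (pre-normalized) static one.
        return self.alpha * dyn_adj + (1 - self.alpha) * static_adj
```

In a DGCRN-style model, the adjacency returned here would drive the graph convolutions inside each recurrent (GRU) step, so the effective graph can change as traffic conditions evolve.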
“…$S^{i}_{train}$ is static for all samples during training, helping to make the training process more robust and accurate. The feature $H^{i}$ is dynamic across training samples to reflect the dynamics of the dependency graphs [19]. As such, we use the cross-entropy between $\Theta$ and the $k$NN graph $A_a$ as a graph structure regularization:…”
Section: The Forecasting Stage (mentioning)
confidence: 99%
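The quoted regularizer is a cross-entropy between a learned edge-probability matrix $\Theta$ and a binary $k$NN graph $A_a$. A minimal sketch under that reading follows; treating $\Theta$ as unnormalized logits and weighting all entries equally are assumptions, since the excerpt does not specify those details.

```python
import torch
import torch.nn.functional as F

def graph_structure_regularizer(theta_logits: torch.Tensor,
                                knn_adj: torch.Tensor) -> torch.Tensor:
    """Cross-entropy between learned edge scores Theta and a kNN graph A_a.

    theta_logits: (N, N) unnormalized edge scores (assumed to be logits)
    knn_adj:      (N, N) binary {0, 1} adjacency of the kNN graph
    """
    return F.binary_cross_entropy_with_logits(theta_logits, knn_adj.float())
```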
“…Inductive Adaptive Graph Generation. To model the hidden relations among nodes, (Wu et al. 2019; Bai et al. 2020; Wu et al. 2020) learn a static adaptive adjacency matrix but disregard the dynamic dependencies. Later, (Li et al. 2021) handle the dynamic relations by learning the matrix at each recurrent step. However, these methods depend heavily on a node embedding layer, which is not available in inductive settings.…”
Section: Adaptive Skip Graph Gated Recurrent Unit (mentioning)
confidence: 99%
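The static adaptive adjacency criticized here is, in Graph WaveNet and related models, built from learned node-embedding tables, which is exactly why it is transductive: a node missing from the table has no embedding. A short sketch of that standard construction (parameter names are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StaticAdaptiveAdjacency(nn.Module):
    """Static adaptive adjacency in the style of Graph WaveNet:
    two learned embedding tables, one row per node, whose product is
    normalized into a graph. The fixed-size tables are what make the
    construction transductive."""

    def __init__(self, num_nodes, embed_dim):
        super().__init__()
        self.E1 = nn.Parameter(torch.randn(num_nodes, embed_dim))
        self.E2 = nn.Parameter(torch.randn(num_nodes, embed_dim))

    def forward(self) -> torch.Tensor:
        # softmax(relu(E1 @ E2^T)) yields a row-normalized adjacency.
        return F.softmax(F.relu(self.E1 @ self.E2.t()), dim=-1)
```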
“…For instance, (Wu et al. 2020) propose a dilated inception temporal convolution to discover relations at different scales and to enlarge the model's receptive field for long-sequence modeling. To address hidden relations among sensors, (Li et al. 2021; Bai et al. 2020) further propose graph-generation modules that learn an adaptive adjacency matrix describing the hidden relation strength between two nodes. However, these models are based on transductive learning, cannot handle a variable number of input nodes, and are less explored for inference problems.…”
Section: Spatiotemporal Graph Neural Network (mentioning)
confidence: 99%
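The dilated inception temporal convolution mentioned above runs several 1-D convolutions with different kernel sizes in parallel at a shared dilation factor and concatenates their outputs. A minimal sketch follows; the kernel set (2, 3, 6, 7) is the one reported for MTGNN (Wu et al. 2020), while the names and equal channel split are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DilatedInception(nn.Module):
    """Parallel dilated 1-D convolutions with mixed kernel sizes.

    Each branch shares the same dilation; outputs are truncated to the
    shortest (most recent) time span and concatenated along channels.
    """

    def __init__(self, in_ch, out_ch, kernels=(2, 3, 6, 7), dilation=1):
        super().__init__()
        assert out_ch % len(kernels) == 0, "out_ch must split evenly"
        self.convs = nn.ModuleList(
            nn.Conv1d(in_ch, out_ch // len(kernels), k, dilation=dilation)
            for k in kernels
        )

    def forward(self, x):
        # x: (batch, in_ch, time)
        outs = [conv(x) for conv in self.convs]
        min_len = min(o.size(-1) for o in outs)
        # Keep the trailing (most recent) steps so branches stay aligned.
        return torch.cat([o[..., -min_len:] for o in outs], dim=1)
```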