2023
DOI: 10.1002/ett.4789

Reducing hysteresis and over‐smoothing in traffic estimation: A multistream spatial‐temporal graph convolutional network

Abstract: Accurate traffic estimation contributes to safer route planning for Autonomous Vehicles (AVs). Currently, deep learning methods based on graph convolution networks (GCNs) and recurrent neural networks (RNNs) are widely used in traffic estimation. However, such methods suffer from spatial over‐smoothing and temporal hysteresis, which lead to estimation results deviating from the ground truth. Therefore, a multistream spatial‐temporal graph convolutional network (MSGCN) is proposed in this paper to deal with the…

Cited by 3 publications (1 citation statement)
References: 43 publications
“…In the paper by Yu et al. [3], common deep learning methods used in traffic estimation suffer from spatial over-smoothing and temporal hysteresis, which lead to estimation results deviating from the ground truth. Therefore, the authors propose a multi-stream spatial-temporal graph convolutional network (MSGCN), integrating local, global, and differential spatial-temporal features that are modeled from multiple dimensions.…”
Section: Papers in This Special Section
Citation type: mentioning (confidence: 99%)
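
As a rough illustration of how a multi-stream spatial-temporal graph convolution can combine local, global, and differential feature streams, here is a minimal NumPy sketch. It is not the authors' MSGCN implementation: the layer shapes, the temporal-difference stream, and the fusion-by-concatenation step are all illustrative assumptions.

    import numpy as np

    def normalize_adj(adj):
        # Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2
        a = adj + np.eye(adj.shape[0])
        d = np.power(a.sum(axis=1), -0.5)
        return (a * d[:, None]) * d[None, :]

    def gcn_layer(x, adj_norm, weight):
        # One graph convolution: propagate node features over the graph, project, ReLU.
        # x: (time, nodes, features), adj_norm: (nodes, nodes), weight: (features, hidden)
        return np.maximum(adj_norm @ x @ weight, 0.0)

    def multistream_block(x, adj_local, adj_global, weights):
        # Three streams over the same node features:
        #  - local stream: 1-hop neighbourhood adjacency
        #  - global stream: longer-range / fully connected adjacency (stand-in)
        #  - differential stream: first-order temporal difference of the features
        h_local = gcn_layer(x, normalize_adj(adj_local), weights["local"])
        h_global = gcn_layer(x, normalize_adj(adj_global), weights["global"])
        x_diff = np.diff(x, axis=0, prepend=x[:1])  # crude temporal difference
        h_diff = gcn_layer(x_diff, normalize_adj(adj_local), weights["diff"])
        # Simple fusion by concatenation along the feature axis.
        return np.concatenate([h_local, h_global, h_diff], axis=-1)

    # Toy usage with random data (12 time steps, 5 sensors, 4 input features).
    T, N, F, H = 12, 5, 4, 8
    rng = np.random.default_rng(0)
    x = rng.normal(size=(T, N, F))
    adj_local = (rng.random((N, N)) < 0.3).astype(float)
    adj_local = np.maximum(adj_local, adj_local.T)      # make symmetric
    adj_global = np.ones((N, N))                        # fully connected stand-in
    weights = {k: rng.normal(size=(F, H)) * 0.1 for k in ("local", "global", "diff")}
    out = multistream_block(x, adj_local, adj_global, weights)
    print(out.shape)  # (12, 5, 24)

In this sketch all three streams share the same graph-convolution primitive; the published model presumably uses its own spatial and temporal modules and a learned fusion rather than plain concatenation.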