2022
DOI: 10.3390/s22207994
Convolutional Long-Short Term Memory Network with Multi-Head Attention Mechanism for Traffic Flow Prediction

Abstract: Accurate predictive modeling of traffic flow is critically important as it allows transportation users to make wise decisions to circumvent traffic congestion regions. The advanced development of sensing technology makes big data more affordable and accessible, meaning that data-driven methods have been increasingly adopted for traffic flow prediction. Although numerous data-driven methods have been introduced for traffic flow predictions, existing data-driven methods cannot consider the correlation of the ext…

Cited by 14 publications (7 citation statements)
References 46 publications
“…The decoder LSTM takes the encoding and generates the forecast $\hat{y}_1, \dots, \hat{y}_M$, where $M$ is the number of steps to forecast. The decoder LSTM also has an attention mechanism, which allows it to focus on different parts of the encoding at each time step and generate the forecast one step at a time [47]. At time step $t$, the decoder computes the attention weights over the encoding and calculates the context vector as a weighted sum of the encodings, as follows: …”
Section: Methods
confidence: 99%
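To make the quoted attention step concrete, here is a minimal NumPy sketch of computing attention weights over the encoder outputs and forming the context vector as their weighted sum at one decoder step. The dot-product scoring, the names H and s_t, and the shapes are illustrative assumptions; reference [47] may use a different (e.g., additive) alignment function.

```python
# Minimal sketch of one attention step, assuming encoder outputs H of shape
# (T, d) and a decoder hidden state s_t of shape (d,). Names are hypothetical.
import numpy as np

def attention_context(H, s_t):
    """Compute attention weights over the encodings H and return the
    context vector as their weighted sum (one decoder time step)."""
    scores = H @ s_t                               # alignment scores, shape (T,)
    scores -= scores.max()                         # numerical stability before softmax
    alpha = np.exp(scores) / np.exp(scores).sum()  # attention weights, sum to 1
    context = alpha @ H                            # weighted sum of encodings, shape (d,)
    return context, alpha

# Usage: T = 12 encoder steps, d = 64 hidden units
H = np.random.randn(12, 64)
s_t = np.random.randn(64)
c_t, alpha = attention_context(H, s_t)
```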
“…To help the constructed predictor better capture the factors influencing molten-iron yield and to improve the prediction accuracy of the neural network, this article stacks multiple DAEs to build a deep neural network, so that the higher hidden layers can learn more robust feature representations; the process is shown in Figure 5. The attention mechanism can explore long-distance relationships between labels through global interaction analysis of the data, which is advantageous for extracting high-dimensional features; it is generally divided into the self-attention mechanism 23) and the multi-head attention mechanism 24). The self-attention mechanism concentrates attention on the current location without perceiving other locations during the encoding process.…”
Section: f_l
confidence: 99%
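The contrast the quote draws between self-attention and multi-head attention can be illustrated with PyTorch's built-in layer. Everything below (tensor shapes, head count, the choice of torch.nn.MultiheadAttention) is an illustrative assumption, not the setup of the works cited as 23) and 24).

```python
# Hedged illustration: single-head self-attention vs. multi-head attention.
import torch
import torch.nn as nn

x = torch.randn(8, 20, 64)  # (batch, sequence length, feature dim)

# Self-attention: queries, keys, and values all come from the same sequence x.
self_attn = nn.MultiheadAttention(embed_dim=64, num_heads=1, batch_first=True)
out_single, _ = self_attn(x, x, x)

# Multi-head attention: the 64-dim features are split across 8 heads, letting
# each head attend to a different subspace of the representation.
multi_attn = nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)
out_multi, weights = multi_attn(x, x, x)
print(out_multi.shape)   # torch.Size([8, 20, 64])
print(weights.shape)     # torch.Size([8, 20, 20]) -- averaged over heads
```

With num_heads=8, the 64-dimensional features are projected into eight 8-dimensional subspaces, so different heads can track different long-distance patterns at once.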
“…Wei et al. [26] proposed a model called AutoEncoder Long Short-Term Memory (AE-LSTM), which uses an autoencoder to capture the internal relationships of traffic flow by extracting features of upstream and downstream traffic flow data, and employs LSTM to predict the complex traffic flow data. Wei et al. [27] proposed a decoder convolutional LSTM model, where the convolutional operation is used to consider the correlation of high-dimensional features, and the LSTM network is used to consider the temporal correlation of traffic flow data. Moreover, a multi-head attention mechanism is introduced to use the most relevant portion of the traffic data to improve prediction performance.…”
Section: Related Work
confidence: 99%
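As a rough, hedged sketch of the pipeline the preceding quote attributes to Wei et al. [27], the toy module below chains a 1-D convolution (high-dimensional feature correlation), an LSTM (temporal correlation), and multi-head attention over the LSTM outputs. All layer sizes, the kernel width, and the single-step output head are assumptions for illustration, not the paper's published configuration.

```python
# Toy conv + LSTM + multi-head attention forecaster; hypothetical sizes.
import torch
import torch.nn as nn

class ConvLSTMAttention(nn.Module):
    def __init__(self, in_ch=1, hidden=64, heads=4):
        super().__init__()
        self.conv = nn.Conv1d(in_ch, 16, kernel_size=3, padding=1)   # feature extraction
        self.lstm = nn.LSTM(16, hidden, batch_first=True)            # temporal correlation
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)                             # one-step flow forecast

    def forward(self, x):                                 # x: (batch, time, channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # convolve along the time axis
        h, _ = self.lstm(h)                               # per-step hidden states
        h, _ = self.attn(h, h, h)                         # attend to the most relevant steps
        return self.head(h[:, -1])                        # predict from the last attended step

model = ConvLSTMAttention()
y = model(torch.randn(8, 12, 1))   # 12 past intervals -> next-interval flow, shape (8, 1)
```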