2023
DOI: 10.3390/su15021374

Short-Term Traffic Flow Prediction Based on the Optimization Study of Initial Weights of the Attention Mechanism

Abstract: Traffic-flow prediction plays an important role in the construction of intelligent transportation systems (ITS). To improve the accuracy of short-term traffic flow prediction, a prediction model (GWO-attention-LSTM) based on the combination of an optimized attention mechanism and long short-term memory (LSTM) is proposed. The model is built on LSTM and uses the attention mechanism to assign individual weights to the feature information extracted by the LSTM. This can increase the prediction model’s focus…
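The abstract describes the architecture only in prose. As a rough illustration of the LSTM-plus-attention structure it refers to (attention weights assigned to the LSTM-extracted features, with the attention layer's initial weights later tuned by GWO), here is a minimal PyTorch sketch; the class and parameter names are illustrative and not taken from the paper.

```python
# Minimal sketch of an LSTM + attention predictor for short-term traffic flow.
# Assumption: input is a (batch, time, features) tensor of past flow readings;
# an additive attention layer re-weights the LSTM hidden states before a linear head.
# All names are illustrative; this is not the authors' released code.
import torch
import torch.nn as nn


class TrafficAttentionLSTM(nn.Module):
    def __init__(self, n_features: int = 1, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Additive attention: one score per time step.
        self.attn = nn.Linear(hidden_size, 1)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features)
        h, _ = self.lstm(x)                      # (batch, time, hidden)
        scores = self.attn(h)                    # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)   # attention weights over time steps
        context = (weights * h).sum(dim=1)       # (batch, hidden)
        return self.head(context).squeeze(-1)    # next-interval flow estimate


# Tiny smoke test on random data standing in for a flow series.
model = TrafficAttentionLSTM(n_features=1, hidden_size=32)
x = torch.randn(8, 12, 1)   # 8 windows of 12 past intervals
print(model(x).shape)       # torch.Size([8])
```

In the paper's scheme, the grey wolf optimizer is used to choose the initial values of the attention layer's weights (self.attn here) before normal gradient training.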

Cited by 7 publications (5 citation statements) | References: 20 publications

“…Lan Tianhe et al. proposed a prediction model (GWO-attention-LSTM) based on the combination of the optimized attention mechanism and LSTM. The results indicate that the GWO-attention-LSTM model has good performance and can provide effective assistance for traffic management control and traffic flow theory research [24]. Symmetry-based Bi-LSTM networks can overcome the drawback of one-way LSTM networks of only being able to learn unidirectional information [25,26].…”
Section: Literature Review
confidence: 90%
“…Three gating units with different functions determine when the information flow in the LSTM module flows in, out, and is forgotten, respectively. As equation (21) shows, the forget gate takes x_t and h_{t-1} as inputs and discards the information using the sigmoid function.…”
Section: Long Short-Term Memory Network
confidence: 99%
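Equation (21) of the citing paper is not reproduced on this page. For reference, the forget gate it describes usually takes the standard form below, where W_f, U_f and b_f are the gate's learned parameters and σ is the sigmoid; whether the citing paper writes it with a single concatenated weight matrix instead is left open here.

```latex
f_t = \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right), \qquad
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t
```

Entries of f_t near zero suppress the corresponding components of the previous cell state c_{t-1}, which is the “discarding” the statement refers to.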
“…On the basis of this observation, Zhang et al. [20] modified the genetic algorithm in order to accelerate the convergence of the optimized LSTM. The attention mechanism was added to the LSTM by Lan et al. [21]. The Gray Wolf Optimization (GWO) method was used to change the initial weight values of the attention mechanism in order to increase the prediction model's attention to important information.…”
Section: Introduction
confidence: 99%
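The statement above only names the Gray Wolf Optimization step. A compact sketch of how GWO could search for an initial weight vector is given below; the objective function is a placeholder, whereas in the cited setup it would be the attention-LSTM's prediction error evaluated with the candidate initial attention weights. Function and parameter names are illustrative.

```python
# Sketch of Grey Wolf Optimization (GWO) searching for an initial weight vector.
# The objective below is a stand-in; in the cited setup it would train/evaluate the
# attention-LSTM with the candidate attention initial weights and return its error.
import numpy as np

rng = np.random.default_rng(0)


def objective(w: np.ndarray) -> float:
    # Placeholder fitness: distance to an arbitrary "good" weight vector.
    return float(np.sum((w - 0.5) ** 2))


def gwo(dim: int = 16, n_wolves: int = 20, n_iters: int = 100,
        lo: float = -1.0, hi: float = 1.0) -> np.ndarray:
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iters):
        fitness = np.array([objective(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]  # three best wolves
        a = 2.0 - 2.0 * t / n_iters                           # linearly decreasing coefficient
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D
            wolves[i] = np.clip(new_pos / 3.0, lo, hi)        # average of the three pulls
    fitness = np.array([objective(w) for w in wolves])
    return wolves[np.argmin(fitness)]


best = gwo()
print(objective(best))   # should be close to 0 for the placeholder objective
```

The vector returned by gwo() would then be loaded into the attention layer as its initial weights before gradient training proceeds as usual.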
“…The attention mechanism in the Transformer model [28] was introduced in this paper to promote the capture of relationship features between temporal signals. This mechanism can assign a weight to the input by itself, thereby enabling the model to focus on the essential information of rolling bearing faults and improving the efficiency of feature extraction.…”
Section: Siamese CNN-BiLSTM Model-Based Quantitative Diagnosis of Rol...
confidence: 99%
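Assuming the attention referred to in [28] is the standard Transformer scaled dot-product attention, the self-assigned weighting the statement describes is, for reference:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
```

Here Q, K and V are the query, key and value projections of the input sequence and d_k is the key dimension; each softmax row sums to one, so every output position is a learned weighted average of the values.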