2023
DOI: 10.1049/itr2.12433
A parking occupancy prediction method incorporating time series decomposition and temporal pattern attention mechanism

Wei Ye,
Haoxuan Kuang,
Jun Li
et al.

Abstract: Parking occupancy prediction is an important reference for travel decisions and parking management. However, owing to various related factors, such as commuting or traffic accidents, parking occupancy exhibits complex change features that are difficult to model, and therefore difficult to predict, accurately. Moreover, how to assign appropriate weights to these changing features during prediction becomes a new challenge in the era of machine learning. To tackle these challenges, a park…

Cited by 2 publications (1 citation statement)
References 34 publications
“…This RNN is capable of accurately realizing linear self-attention, which is a key component of the Transformer Network. Ye et al. [21] proposed a novel temporal attention model that can assign appropriate weights to time-varying features during the prediction process. A bidirectional gated recurrent unit (GRU) based network intrusion detection model with hierarchical attention mechanism is presented [22].…”
Section: Combine the Attention Mechanism
Confidence: 99%
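The temporal attention idea referenced in the citation statement — assigning weights to time-varying features during prediction — can be sketched in a minimal form. This is an illustrative dot-product attention over RNN hidden states, not the exact temporal pattern attention of the cited paper; the function names and shapes are assumptions for illustration only:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def temporal_attention(hidden_states, query):
    """Weight a sequence of per-timestep features by relevance to a query.

    hidden_states: (T, d) array, e.g. RNN/GRU hidden states over T timesteps.
    query: (d,) array, e.g. the final hidden state.
    Returns the attention-weighted context vector and the weights.
    """
    scores = hidden_states @ query      # (T,) alignment score per timestep
    weights = softmax(scores)           # (T,) weights summing to 1
    context = weights @ hidden_states   # (d,) weighted sum of features
    return context, weights
```

In a predictor of this kind, `context` (often concatenated with the query) would feed a final linear layer to produce the occupancy forecast; timesteps whose features align with the query receive larger weights.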