2023
DOI: 10.3389/fenrg.2023.1227979
Deep learning time pattern attention mechanism-based short-term load forecasting method

Wei Liao, Jiaqi Ruan, Yinghua Xie, et al.

Abstract: Accurate load forecasting is crucial to improve the stability and cost-efficiency of smart grid operations. However, how to integrate multiple significant factors to enhance load forecasting performance has been insufficiently investigated in previous studies. To fill this gap, this study proposes a novel hybrid deep learning model for short-term load forecasting. First, the long short-term memory (LSTM) network is utilized to capture patterns from historical load data. Second, a time pattern attention (TPA) mechanism is…
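The abstract sketches the model's two stages: an LSTM encodes the historical load window, and a time pattern attention (TPA) layer re-weights the temporal patterns in the LSTM hidden states before the forecast is produced. As a rough illustration of that kind of architecture (not the authors' implementation; the layer sizes, feature layout, and the class name TPALoadForecaster are assumptions for this sketch), a minimal PyTorch version might look like this:

import torch
import torch.nn as nn

class TPALoadForecaster(nn.Module):
    """LSTM encoder with a temporal-pattern-attention readout (illustrative sketch only)."""
    def __init__(self, n_features, hidden=64, window=24, n_filters=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        # Full-length temporal filters: each column scores one "time pattern"
        # across the window for every hidden dimension (equivalent to 1-D
        # convolutions spanning the whole window).
        self.filters = nn.Parameter(torch.randn(window - 1, n_filters) * 0.1)
        self.w_a = nn.Linear(n_filters, hidden, bias=False)   # attention scoring
        self.w_h = nn.Linear(hidden, hidden, bias=False)
        self.w_v = nn.Linear(n_filters, hidden, bias=False)
        self.out = nn.Linear(hidden, 1)                        # next-step load

    def forward(self, x):                        # x: (batch, window, n_features)
        h_all, _ = self.lstm(x)                  # (batch, window, hidden)
        h_t = h_all[:, -1, :]                    # last hidden state
        H = h_all[:, :-1, :].transpose(1, 2)     # (batch, hidden, window-1)
        Hc = H @ self.filters                    # temporal patterns, (batch, hidden, n_filters)
        # Attend over hidden dimensions by matching their patterns against h_t.
        scores = (self.w_a(Hc) * h_t.unsqueeze(1)).sum(-1)    # (batch, hidden)
        alpha = torch.sigmoid(scores).unsqueeze(-1)
        v = (alpha * Hc).sum(dim=1)              # context vector, (batch, n_filters)
        h_prime = self.w_h(h_t) + self.w_v(v)    # fuse context with last state
        return self.out(h_prime)                 # forecast for the next step

model = TPALoadForecaster(n_features=5, window=24)
y_hat = model(torch.randn(8, 24, 5))             # 8 samples, 24 hourly steps, 5 features

In this layout, external factors such as weather or calendar features would simply be appended to the per-step feature vector; the paper's actual handling of those inputs is only partially visible in the truncated abstract.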

Cited by 6 publications (1 citation statement) · References: 28 publications
“…an advanced LSTM-based dual-attention model, meticulously considering the myriad of influencing factors and the effects of time nodes on STLF. Liao et al. [33], with their innovative fusion of LSTM and a time pattern attention mechanism, augmented STLF methodologies, emphasizing feature extraction and model versatility. By incorporating external factors, their comprehensive approach improved feature extraction and demonstrated superior performance compared to existing methods.…”
Section: Introduction · Citation type: mentioning (confidence: 99%)