2022
DOI: 10.1016/j.epsr.2021.107761
Short-Term Load Forecasting Using Channel and Temporal Attention Based Temporal Convolutional Network

Cited by 88 publications (29 citation statements)
References 35 publications
“…These models have recently drawn considerable attention in the field of STLF as they can result in relatively higher accuracy, lower training times, and more interpretable forecasts [33,39]. Specifically, Tang et al. [59] employed a TCN architecture with channel and temporal attention mechanisms to exploit the non-linear relationships between weather factors and load. Yin and Xie [65] applied a multi-temporal-spatial-scale TCN for forecasting the load of a city in China, while Gu and Jia [15] compared the TCN architecture with three traditional models, suggesting its superiority when used to forecast the load of a certain region in Shanghai.…”
Section: Related Work
confidence: 99%
“…A TCN model is a type of neural network designed for sequential data processing (Tang et al., 2022). It uses 1D convolutional layers to extract features from the input sequence and has been shown to be effective across a range of sequence modelling tasks.…”
Section: Temporal Convolutional Network (TCN)
confidence: 99%
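The excerpt above notes that TCNs build on 1D convolutions over the input sequence. The defining ingredient is a *causal dilated* 1D convolution, where the output at time t depends only on inputs at times ≤ t. A minimal NumPy sketch of that operation (an illustration of the general technique, not the cited paper's implementation; the function name and single-channel setup are assumptions for clarity):

```python
import numpy as np

def causal_dilated_conv1d(x, weights, dilation=1):
    """Causal dilated 1D convolution over a single-channel series.

    x       : (T,) input series
    weights : (k,) kernel; weights[0] taps x[t], weights[j] taps x[t - j*dilation]
    Returns an output of the same length T, where out[t] uses no future inputs.
    """
    k = len(weights)
    # Left-pad with zeros so the output keeps length T and stays causal.
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    out = np.zeros(len(x), dtype=float)
    for t in range(len(x)):
        # Taps at t, t - dilation, ..., t - (k-1)*dilation in padded coordinates.
        taps = xp[t + pad - np.arange(k) * dilation]
        out[t] = taps @ weights
    return out
```

Stacking such layers with exponentially growing dilations (1, 2, 4, …) is what gives a TCN a large receptive field over the load history with only a few layers.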
“…The authors reported that a simple convolutional architecture is more effective across diverse sequence modelling tasks than recurrent architectures such as LSTMs and GRUs. TCNs have recently been validated for STLF tasks [55], [56] and are employed throughout our case study.…”
Section: Model Selection
confidence: 99%