2023
DOI: 10.1016/j.ijepes.2022.108743

Probabilistic forecasting method for mid-term hourly load time series based on an improved temporal fusion transformer model

Cited by 39 publications (9 citation statements)
References 28 publications
“…Finally, TFT models have been introduced only recently for multi-horizon time series forecasting in the literature [10]. Nevertheless, they have been used for MTLF predictions over the energy load [39], where they have demonstrated substantial improvements in forecasting accuracy as well as the incorporation of uncertainty estimation in time series forecasting. However, the data used for training do not include environmental factors or categorical values, and the experiments are performed with sufficient resources in terms of memory, processing power and GPU availability.…”
Section: Discussion (mentioning)
confidence: 99%
“…By introducing the self-attention mechanism, the Transformer model can model the relationships between different positions in the sequence more effectively. For knowledge graph construction, it is necessary to extract entities from the knowledge domain and assign their knowledge labels so that the knowledge graph can be visualized [21].…”
Section: Model Structure Of Transformer (mentioning)
confidence: 99%
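The self-attention mechanism referenced in this excerpt is usually the scaled dot-product attention of the original Transformer, in which every position in the sequence is scored against every other position. The sketch below is a minimal illustration of that idea, not code from the cited paper; the projection matrices, shapes, and function name are assumptions for exposition.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one sequence.

    x            : (seq_len, d_model) input sequence (e.g. embedded hourly loads)
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices (assumed here)
    returns      : (seq_len, d_k) output where each position is a weighted
                   mix of all positions in the sequence.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])        # pairwise position scores
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Because the attention weights couple every time step to every other step directly, the mechanism can relate distant positions in a sequence without passing information through intermediate steps, which is the property the excerpt alludes to.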
“…Consequently, it provides a more precise determination of the upper and lower bounds of PIs. The model introduced in [37] leverages GRU and an enhanced temporal fusion transformer, which significantly improves the model's ability to capture long-term dependencies in the data. Furthermore, this model includes quantile constraints and PI penalty terms to eliminate quantile crossover and generate concise PIs.…”
Section: Probabilistic STLF (mentioning)
confidence: 99%
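The quantile constraints and prediction-interval (PI) penalty terms described in this excerpt can be illustrated with a standard pinball (quantile) loss augmented by a quantile-crossing penalty. The sketch below shows that general technique under this assumption; it is not the specific formulation of [37], and the function name and penalty weight are illustrative only.

```python
import numpy as np

def pinball_with_crossing_penalty(y_true, y_pred, quantiles, penalty=1.0):
    """Pinball (quantile) loss plus a penalty on quantile crossover.

    y_true   : (n,) observed values
    y_pred   : (n, q) predicted quantiles, columns ordered by `quantiles`
    quantiles: (q,) quantile levels, e.g. np.array([0.1, 0.5, 0.9])
    """
    diff = y_true[:, None] - y_pred
    # Standard pinball loss, averaged over samples and quantile levels.
    pinball = np.mean(np.maximum(quantiles * diff, (quantiles - 1.0) * diff))
    # Crossing penalty: a higher quantile prediction should never fall
    # below a lower one; penalize any violation of that ordering.
    crossing = np.maximum(y_pred[:, :-1] - y_pred[:, 1:], 0.0)
    return pinball + penalty * np.mean(crossing)
```

Penalizing the positive part of the gap between adjacent quantile columns is one common way to discourage crossover, while the pinball term calibrates each quantile and keeps the resulting intervals narrow.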