2023
DOI: 10.48550/arxiv.2301.02068
Preprint

Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution

Abstract: Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning. Transformer models have been adopted to deliver high prediction capacity because of the high computational self-attention mechanism. Though one could lower the complexity of Transformers by inducing the sparsity in point-wise self-attentions for LTTF, the limited information utilization prohibits the model from exploring the complex dependencies comprehensively. To this end, we propos…

Cited by 2 publications (5 citation statements)

References 28 publications
“…6, the generalizability of our joint imputation and forecasting system is evaluated against fluctuations and new unseen user behavior (which are new mobility patterns tuned in this testing period). We compare our approach to RNN-based and transformer-based approaches, which are well-known models for predicting temporal dynamics [37]. From Fig.…”
Section: Simulation Results and Analysis
confidence: 99%
“…The Conformer model shows significantly improved performance and training speed due to innovations such as incorporating multi-variable and temporal dependencies, utilizing both stationary and instant recurrent network blocks, and implementing a hybrid convolutional module. The Conformer model, developed by Yan Li et al. [22] in 2023, efficiently and stably predicts long-period sequences with obvious periodicity in multivariate time-series data, addressing the associated computational efficiency and stability issues. The Conformer model comprises three main components: the input representation block, the encoder-decoder architecture, and the normalizing flow block.…”
Section: The Principle of the Conformer Model
confidence: 99%
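For readers who want a concrete picture of the three-component structure named in the statement above, here is a minimal structural sketch in PyTorch. It is not the authors' implementation: every module choice, shape, and hyperparameter is an illustrative assumption, and the encoder-decoder core is a plain Transformer placeholder rather than Conformer's stationary/instant recurrent blocks and hybrid convolutional module.

```python
# Structural sketch only: input representation -> encoder-decoder -> output head.
# All names and settings below are assumptions, not the Conformer codebase.
import torch
import torch.nn as nn

class ConformerSketch(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64, horizon: int = 96):
        super().__init__()
        # 1) Input representation: project multivariate inputs into model space.
        self.input_repr = nn.Linear(n_vars, d_model)
        # 2) Encoder-decoder core: a plain Transformer stands in for the
        #    attention/recurrent/convolutional hybrid described above.
        self.core = nn.Transformer(d_model=d_model, batch_first=True)
        # 3) "Normalizing flow" block: reduced here to a simple affine head
        #    mapping decoder states back to the variable space.
        self.flow_head = nn.Linear(d_model, n_vars)
        self.horizon = horizon

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, input_len, n_vars)
        src = self.input_repr(history)
        # Repeat the last encoded step over the horizon as the decoder input.
        tgt = src[:, -1:, :].repeat(1, self.horizon, 1)
        dec = self.core(src, tgt)      # (batch, horizon, d_model)
        return self.flow_head(dec)     # (batch, horizon, n_vars)

# Usage: forecast 96 future steps for 7 variables from 192 past steps.
x = torch.randn(8, 192, 7)
y_hat = ConformerSketch(n_vars=7)(x)
print(y_hat.shape)  # torch.Size([8, 96, 7])
```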
“…Based on multi-scale dynamics, a time-series frequently displays distinctive temporal patterns at different resolutions [22]. In order to extract the temporal patterns across various scales, this study extracted the four temporal frequencies of the year, month, week and day to establish a temporal resolution set T ⊆ {year, month, week, day}.…”
Section: Multi-scale Temporal Patterns
confidence: 99%
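As a small illustration of the multi-scale idea quoted above, the following pandas snippet extracts the four temporal frequencies of year, month, week, and day from a timestamp index to form the temporal resolution set T ⊆ {year, month, week, day}. The date range and column names are assumptions for the example, not code from the cited study.

```python
# Build coarse-to-fine calendar features from a timestamp index (assumed example data).
import pandas as pd

timestamps = pd.date_range("2023-01-01", periods=6, freq="D")
features = pd.DataFrame({
    "year": timestamps.year,                    # yearly resolution
    "month": timestamps.month,                  # monthly resolution
    "week": timestamps.isocalendar().week,      # ISO week resolution
    "day": timestamps.day,                      # daily resolution
}, index=timestamps)
print(features)
```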