2019
DOI: 10.48550/arxiv.1909.00590
Preprint
Recurrent Neural Networks for Time Series Forecasting: Current Status and Future Directions

Hansika Hewamalage,
Christoph Bergmeir,
Kasun Bandara

Abstract: Recurrent Neural Networks (RNN) have become competitive forecasting methods, as most notably shown in the winning method of the recent M4 competition. However, established statistical models such as ETS and ARIMA gain their popularity not only from their high accuracy, but also because they are suitable for non-expert users: they are robust, efficient, and automatic. In these areas, RNNs still have a long way to go. We present an extensive empirical study and an open-source software framework of existing RNN archite…


Cited by 13 publications (16 citation statements)
References 46 publications
“…• Scale Normalization as Preprocessing: Several forms of preprocessing have been recommended for global methods [31,33,3,42]. They are considered an important component of the performance of the methods.…”
Section: Modern Practices and Model Classes for Global Methods
confidence: 99%
“…In order to make the data fit the LSTM model, a preprocessing step is necessary to normalize the raw data between 0 and 1 [33]. This process adjusts the data to a common magnitude scale, providing more effective weight adjustments for the neural networks [32]. The normalization is performed by Equation (7)…”
Section: Dataset
confidence: 99%
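The min-max normalization described in this statement can be sketched as follows. This is a minimal illustration, not code from the cited paper or its framework; the function name and the sample series are hypothetical.

```python
import numpy as np

def min_max_normalize(series):
    """Scale a 1-D series into the [0, 1] range (min-max normalization),
    putting all series on a common magnitude scale."""
    series = np.asarray(series, dtype=float)
    lo, hi = series.min(), series.max()
    return (series - lo) / (hi - lo)

# The smallest observation maps to 0.0 and the largest to 1.0.
normalized = min_max_normalize([10.0, 20.0, 15.0, 30.0])
```

In practice the minimum and maximum would be computed on the training portion only and reused for test data, so that no information leaks from the future into the scaling.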
“…Recent trends in ML such as deep learning, especially deep recurrent NNs (RNNs), are very attractive for time series forecasting [15]. RNNs, whose connections between nodes form a directed graph along a temporal sequence, can exhibit temporal dynamic behavior, using their internal state (memory) to process sequences of inputs.…”
Section: Introduction
confidence: 99%
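The "internal state (memory)" this statement refers to can be illustrated with a minimal Elman-style RNN forward pass. This is a sketch under stated assumptions, not the architecture used in the cited work; the function name, weight shapes, and random inputs are all hypothetical.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a minimal Elman-style RNN over a sequence.

    The hidden vector h is the internal state carried from one
    time step to the next, which lets the network react to the
    whole history of the sequence, not just the current input.
    """
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in inputs:  # one step per element of the temporal sequence
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

rng = np.random.default_rng(0)
seq = [rng.standard_normal(3) for _ in range(5)]   # 5 steps, 3 features each
W_xh = 0.1 * rng.standard_normal((4, 3))           # input-to-hidden weights
W_hh = 0.1 * rng.standard_normal((4, 4))           # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)
states = rnn_forward(seq, W_xh, W_hh, b_h)
```

The recurrent multiplication `W_hh @ h` is what forms the directed graph along the temporal sequence: each step's state depends on the previous step's state.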