2021
DOI: 10.1016/j.ijforecast.2020.06.008
Recurrent Neural Networks for Time Series Forecasting: Current status and future directions

Cited by 736 publications (406 citation statements)
References 39 publications
“…Zeroual et al., 2020, Tomar and Gupta, 2020, Czarnowski et al., 2008, El Zowalaty and Järhult, 2020, Shahid et al., 2020, Hewamalage et al., 2021, Jiménez et al., 2020, Kaushik et al., 2020, BhedadJamshidi et al., 2020, Ribeiro et al., 2020, Naudé, 2020, Arora et al., 2020, Ribeiro et al., 2020, Ogundokun et al., 2020, Alzahrani et al., 2020, Shastri et al., 2020, Alakus and Turkoglu, 2020, Papastefanopoulos et al., 2020, Chimmula and Zhang, 2020, Wang et al., 2020, Wang et al., 2020, DataGov, Car et al., 2020.…”
Section: Uncited References
confidence: 99%
“…Alternative NN architectures, known as recurrent neural networks (RNNs), are also employed for forecasting; there, the connectivity information forming the input-output mapping is stored in hidden states. Because an RNN has a recursive structure, it requires fewer parameters than NAR models to learn the input-output relation in the data [58]. LSTMs are RNN-based architectures in which part of the information held in the hidden layer can be preserved and used at specific moments for forecasting [59].…”
Section: Long Short Term Memory Model
confidence: 99%
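The RNN recurrence described in the quote above can be sketched in a few lines. This is a minimal scalar illustration of why an RNN is parameter-frugal, not code from the cited papers: the same three weights (`w_x`, `w_h`, `b` — hypothetical names) are reused at every time step, with the hidden state carrying the past, whereas a NAR model would need a separate weight per lag.

```python
# Minimal sketch (an assumption, not the cited papers' implementation) of
# the RNN recurrence: h_t = tanh(w_x * x_t + w_h * h_{t-1} + b), with the
# same weights shared across all time steps.
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    # One recurrence step; the hidden state summarises the whole history.
    return math.tanh(w_x * x_t + w_h * h_prev + b)

def rnn_forecast(series, w_x=0.5, w_h=0.8, b=0.0, w_out=1.0):
    """Roll the single shared recurrence over the series and read a
    one-step-ahead forecast from the final hidden state."""
    h = 0.0
    for x_t in series:
        h = rnn_step(x_t, h, w_x, w_h, b)
    return w_out * h  # linear read-out of the last hidden state

forecast = rnn_forecast([0.1, 0.2, 0.3, 0.4])
```

Note the parameter count stays fixed (here four scalars) no matter how long the input series is; in a NAR formulation the parameter count would grow with the chosen lag window.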
“…The LSTM network was developed to deal with the exploding and vanishing gradient problems encountered when training traditional RNNs, as well as to solve the long-term sequence-dependence problem of RNNs. Deep learning architectures built from LSTM layers can address difficult sequence problems and achieve state-of-the-art results in time series prediction [22]. Figure 1.…”
Section: Deep Learning With Long Short-term Memory (Lstm) Network
confidence: 99%
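The gating mechanism behind the vanishing-gradient remedy mentioned above can be sketched as a single scalar LSTM step. This is a hedged illustration under assumed parameter names (`wf_x`, `bf`, etc. are hypothetical, not from [22]): the key point is that the cell-state update is additive, so when the forget gate is near 1 information and gradients pass through largely unchanged.

```python
# Minimal scalar sketch (an assumption, not the cited papers' code) of one
# LSTM cell step, showing the additive cell-state update that mitigates
# vanishing gradients in plain RNNs.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    # Three gates plus a candidate value (scalar weights for clarity).
    f = sigmoid(p["wf_x"] * x_t + p["wf_h"] * h_prev + p["bf"])   # forget gate
    i = sigmoid(p["wi_x"] * x_t + p["wi_h"] * h_prev + p["bi"])   # input gate
    o = sigmoid(p["wo_x"] * x_t + p["wo_h"] * h_prev + p["bo"])   # output gate
    g = math.tanh(p["wg_x"] * x_t + p["wg_h"] * h_prev + p["bg"]) # candidate
    # Additive update: c_t = f * c_{t-1} + i * g. With f close to 1, the
    # old cell state (and its gradient) is preserved across many steps.
    c = f * c_prev + i * g
    h = o * math.tanh(c)  # hidden state exposes a gated view of the cell
    return h, c

# Hypothetical parameters, all set to 0.5 purely for demonstration.
params = {k: 0.5 for k in
          ["wf_x", "wf_h", "bf", "wi_x", "wi_h", "bi",
           "wo_x", "wo_h", "bo", "wg_x", "wg_h", "bg"]}
h, c = lstm_step(0.5, 0.0, 0.0, params)
```

In a plain RNN the hidden state is squashed through `tanh` at every step, so gradients shrink multiplicatively; the additive `c` path above is the design choice that lets LSTMs keep long-term dependencies.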