2016
DOI: 10.4314/bajopas.v9i1.4
Time series prediction with simple recurrent neural networks

Abstract: Simple recurrent neural networks are widely used in time series prediction. Most researchers and application developers often choose arbitrarily between Elman and Jordan simple recurrent neural networks for their applications. A hybrid of the two, called the Elman-Jordan (or multi-recurrent) neural network, is also in use. In this study, we evaluated the performance of these neural networks on three established benchmark time series prediction problems. Results from the experiments showed that Jordan neural netw…

Cited by 14 publications (7 citation statements). References 11 publications.
“…Unlike the JORDAN cell, each time step in the ELMAN cell calls the previous hidden state (Figure 7b). In 2016, a combination of both JORDAN and ELMAN cells in one cell named multi-recurrent neural network (MRNN) was evaluated by Abdulkarim (2016). In this cell structure, at each time step, both previous output and hidden states are presented to the cell (Figure 7c).…”
Section: Taxonomy of RNN Cells
confidence: 99%
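The three feedback patterns described above differ only in which previous state is fed back into the hidden layer: the Elman cell recycles the previous hidden state, the Jordan cell recycles the previous output, and the MRNN uses both. A minimal sketch of one time step of each cell follows; all weight names, shapes, and initializations here are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2  # illustrative sizes, not from the paper

W_x = rng.standard_normal((n_hid, n_in)) * 0.1   # input -> hidden
W_h = rng.standard_normal((n_hid, n_hid)) * 0.1  # hidden(t-1) -> hidden (Elman context)
W_o = rng.standard_normal((n_hid, n_out)) * 0.1  # output(t-1) -> hidden (Jordan context)
W_y = rng.standard_normal((n_out, n_hid)) * 0.1  # hidden -> output

def step(x, h_prev, y_prev, cell):
    """One time step of an Elman, Jordan, or multi-recurrent (MRNN) cell."""
    z = W_x @ x
    if cell in ("elman", "mrnn"):   # Elman: feed back the previous hidden state
        z = z + W_h @ h_prev
    if cell in ("jordan", "mrnn"):  # Jordan: feed back the previous output
        z = z + W_o @ y_prev
    h = np.tanh(z)                  # nonlinear activation on the recurrent sum
    y = W_y @ h
    return h, y

x = rng.standard_normal(n_in)
h, y = np.zeros(n_hid), np.zeros(n_out)
for cell in ("elman", "jordan", "mrnn"):
    h, y = step(x, h, y, cell)
```

With zero initial states the first step of all three cells coincides; the variants diverge from the second step onward, once the fed-back state is nonzero.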
“…RNNs represent one of the best choices when working with data organized as time series. Their working principle is based on combining nonlinear activation functions in a recurrent structure, which makes prediction possible and provides improved prediction accuracy, as stated in [1]. In contrast to standard neural networks, which are usually represented using feedforward architectures, RNNs allow information to be transferred both forward and backward with the help of their feedback connections.…”
Section: LSTM-Based Forecasting of Electricity Consumption and Production
confidence: 99%
“…For time series forecasting, this work has explored various machine learning methods. These methods are divided into traditional statistical methods, such as Auto-Regressive Integrated Moving Average (ARIMA) [63], and deep learning-based techniques, including long short-term memory (LSTM) networks [64], bi-directional recurrent neural networks, Elman networks, Jordan networks [65], and recursive neural networks. In particular, we have explored the ability of different network configurations to capture the behavior of the data, and their performance, measured in terms of the root mean squared error (RMSE), is shown in Table 4.…”
Section: A System Model
confidence: 99%
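The RMSE metric used above to compare the network configurations is straightforward to state; a minimal sketch follows, with made-up illustrative values rather than results from the cited Table 4.

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error: sqrt of the mean of squared residuals."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Illustrative example: residuals are (0, 0, 2), so RMSE = sqrt(4/3)
print(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # ≈ 1.1547
```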