Recurrent Neural Networks for Temporal Data Processing 2011
DOI: 10.5772/16015
Advanced Methods for Time Series Prediction Using Recurrent Neural Networks

Cited by 17 publications (18 citation statements). References 33 publications.
“…The total number of inputs is p + m, the number of hidden neurons is q, while φ is the context unit associated with the recurrent connection that keeps the content of output from the previous training stage [9]. It represents the long-term memory of the network [34]. This type of structure is convenient to derive some special cases.…”
Section: Modeling Strategy and Data (mentioning)
confidence: 99%
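The structure described in the excerpt (p + m total inputs, q hidden neurons, and a context unit φ that feeds the previous output back into the network) can be sketched as follows. This is a minimal illustration, not the chapter's implementation: the dimensions, random initialisation, and tanh activation are assumptions.

```python
import numpy as np

# Hypothetical dimensions: p exogenous inputs, m fed-back outputs, q hidden neurons.
p, m, q = 3, 1, 5
rng = np.random.default_rng(0)

# Weights are randomly initialised purely for illustration.
W_in = rng.normal(scale=0.1, size=(q, p + m))   # (inputs + context) -> hidden
b_h = np.zeros(q)
W_out = rng.normal(scale=0.1, size=(m, q))      # hidden -> output
b_o = np.zeros(m)

def step(x_t, phi):
    """One forward pass; phi is the context unit holding the previous output."""
    z = np.concatenate([x_t, phi])              # p + m total inputs
    h = np.tanh(W_in @ z + b_h)                 # q hidden neurons
    y = W_out @ h + b_o                         # network output
    return y, y                                 # new context = current output

phi = np.zeros(m)                               # context starts empty
for t in range(4):
    x_t = rng.normal(size=p)
    y_t, phi = step(x_t, phi)
```

Because the context unit carries the previous output across time steps, it acts as the long-term memory the excerpt refers to.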
“…where the delay is an integer. We used the parameters given in [4]. 500 values were used for training; the next 100 values were used for testing.…”
Section: Mackey-Glass Chaotic Process (mentioning)
confidence: 99%
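The Mackey-Glass series mentioned here can be generated by numerically integrating its delay differential equation, dx/dt = βx(t−τ)/(1 + x(t−τ)ⁿ) − γx(t). The parameter values below (β = 0.2, γ = 0.1, n = 10, τ = 17) are the common benchmark settings, assumed here because the quoted excerpt does not reproduce the values taken from [4]:

```python
import numpy as np

def mackey_glass(n_samples=600, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0):
    """Euler integration of dx/dt = beta*x(t-tau)/(1+x(t-tau)**n) - gamma*x(t)."""
    history = int(tau / dt)                     # samples needed for the delay term
    x = np.full(n_samples + history, 1.2)       # constant initial history
    for t in range(history, n_samples + history - 1):
        x_tau = x[t - history]                  # delayed value x(t - tau)
        x[t + 1] = x[t] + dt * (beta * x_tau / (1 + x_tau**n) - gamma * x[t])
    return x[history:]

series = mackey_glass()
train, test = series[:500], series[500:600]     # split matching the excerpt
```

The 500/100 split mirrors the training/testing partition described in the quoted study.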
“…Neural networks are known as an effective and friendly tool for black-box modeling of unknown plant's dynamics [1]. Usually, neural networks are trained to perform single-step-ahead (SS) predictions, where the predictor uses some available input and output observations to estimate the variable of interest for the time step immediately following the latest observation [2][3][4]. However, recently there has been growing interest in multi-step-ahead (MS) predictions, where the values of interest must be predicted for some horizon in the future.…”
Section: Introduction (mentioning)
confidence: 99%
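The contrast drawn above between single-step-ahead (SS) and multi-step-ahead (MS) prediction can be sketched with a toy iterative predictor: an MS forecast is produced by repeatedly applying an SS model and feeding each prediction back as an input. The linear predictor and window length below are hypothetical, chosen only to make the loop runnable:

```python
def single_step(model, window):
    """SS: predict only the value immediately after the latest observation."""
    return model(window)

def multi_step(model, window, horizon):
    """Iterative MS: feed each prediction back as the newest 'observation'."""
    buf = list(window)
    preds = []
    for _ in range(horizon):
        y = model(buf)
        preds.append(y)
        buf = buf[1:] + [y]     # slide the window, reusing the prediction
    return preds

# Toy "model": a hypothetical linear predictor over the last 3 observations.
model = lambda w: 0.5 * w[-1] + 0.3 * w[-2] + 0.2 * w[-3]
preds = multi_step(model, [1.0, 1.0, 1.0], horizon=3)
```

Feeding predictions back like this is what makes MS prediction harder than SS: errors compound over the horizon.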
“…Since RNNs are able to simulate any open dynamical system, they have a broad spectrum of applications such as time series forecasting [2], control of plants [3], language modeling [4], speech recognition, neural machine translation [5] and other domains. The easiest way to create an RNN is to add feedback connections to the hidden layer of a multilayer perceptron.…”
Section: Introduction (mentioning)
confidence: 99%
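The construction named in that excerpt, an MLP whose hidden layer receives a feedback copy of its own previous activation (an Elman-style network), can be sketched as follows. Dimensions, weights, and the tanh activation are illustrative assumptions, not details from the cited work:

```python
import numpy as np

# Illustrative dimensions for the sketch.
n_in, n_hidden, n_out = 2, 4, 1
rng = np.random.default_rng(1)
W_x = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input -> hidden
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden feedback
W_y = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

def rnn_step(x_t, h_prev):
    """An MLP layer plus a feedback connection from the previous hidden state."""
    h = np.tanh(W_x @ x_t + W_h @ h_prev)
    return W_y @ h, h

h = np.zeros(n_hidden)
outputs = []
for x_t in rng.normal(size=(5, n_in)):      # a short input sequence
    y, h = rnn_step(x_t, h)
    outputs.append(y)
```

Without W_h this reduces to an ordinary feedforward layer; the feedback matrix is what turns the MLP into a recurrent network.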