2018 24th International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2018.8545666
Recurrent Neural Networks for Financial Time-Series Modelling

Cited by 19 publications (9 citation statements). References 13 publications.
“…Such recurrent architectures are by nature tailored for modelling sequential data with delayed temporal correlations [18]. Many recent works with RNNs have shown good promise in econometric price prediction using either technical indicators [19,20] or social sentiment [21,22].…”
Section: Artificial Neural Networks (ANNs) (mentioning)
confidence: 99%
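The recurrence this excerpt refers to can be made concrete with a minimal sketch. Below is a vanilla RNN step in NumPy; the function name, shapes, and sizes are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """One vanilla RNN step: the hidden state h carries context
    forward through time, which is what lets the model pick up
    delayed temporal correlations in a sequence."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Toy usage with illustrative sizes (D=3 input features, H=4 hidden units)
rng = np.random.default_rng(0)
D, H = 3, 4
W_x = rng.standard_normal((H, D)) * 0.1
W_h = rng.standard_normal((H, H)) * 0.1
b = np.zeros(H)
h = np.zeros(H)
for x_t in rng.standard_normal((5, D)):  # a short, price-like series
    h = rnn_step(x_t, h, W_x, W_h, b)
print(h)
```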
“…Both long-term memory (c[t-1]) and short-term memory (h[t-1]) are processed in a typical LSTM cell through multiple gates that filter the information, as shown in Figure 5. To keep the flow of gradients unchanged, the forget and update gates modify the memory cell state [73,74]. Three gates, i.e., the input gate ig, the forget gate fg, and the output gate og, handle the information flow by writing, deleting, and reading, respectively.…”
Section: Long Short-Term Memory (LSTM) Recurrent Neural Network (mentioning)
confidence: 99%
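The gating described in this excerpt can be sketched in a few lines of code. The following is a minimal single-step LSTM in NumPy using the standard formulation; the gate names follow the quote (ig, fg, og), while the stacking order, weight shapes, and toy sizes are illustrative assumptions, not from the cited paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step: gates filter long-term (c) and short-term (h) memory.

    W has shape (4*H, D+H), b has shape (4*H,); rows are stacked as
    [input gate, forget gate, output gate, candidate] (an assumed layout).
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x_t, h_prev]) + b
    ig = sigmoid(z[0:H])            # input gate: what to write
    fg = sigmoid(z[H:2*H])          # forget gate: what to delete
    og = sigmoid(z[2*H:3*H])        # output gate: what to read
    c_tilde = np.tanh(z[3*H:4*H])   # candidate cell update
    c_t = fg * c_prev + ig * c_tilde  # additive update keeps gradients flowing
    h_t = og * np.tanh(c_t)           # short-term memory passed to the next step
    return h_t, c_t

# Toy usage: D=3 input features, H=4 hidden units
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((5, D)):  # a short sequence
    h, c = lstm_step(x, h, c, W, b)
print(h)
```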
“…They are composed of LSTM cells capable of capturing long-term dependencies in sequences while attenuating the vanishing/exploding gradient problem [28]. This capacity is achieved through forget and update gates that modify the memory cell state while allowing gradients to flow unchanged [29,30]. The LSTM memory cells are composed of self-loops that encode temporal information in the cell states, and three regulator gates that control the flow of information within each cell.…”
Section: Long Short-Term Memory (mentioning)
confidence: 99%
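For reference, the standard LSTM update that both excerpts paraphrase can be written as follows; the notation is the common textbook one, not taken from the cited paper.

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

The additive term f_t ⊙ c_{t-1} is the self-loop the quote mentions: when the forget gate is near 1, gradients pass through the cell state nearly unchanged, which is how the vanishing/exploding gradient problem is attenuated.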