2019
DOI: 10.48550/arxiv.1902.03125
Preprint

High-performance stock index trading: making effective use of a deep LSTM neural network

Abstract: We present a deep long short-term memory (LSTM)-based neural network for predicting asset prices, together with a successful trading strategy for generating profits based on the model's predictions. Our work is motivated by the fact that the effectiveness of any prediction model is inherently coupled to the trading strategy it is used with, and vice versa. This highlights the difficulty in developing models and strategies which are jointly optimal, but also points to avenues of investigation which are broader …
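The abstract describes a predict-then-trade pipeline: a deep (stacked) LSTM forecasts the next price movement, and a trading rule converts the forecast into positions. The Python/Keras snippet below is a minimal illustrative sketch of that pairing; the window length, layer sizes, and the long-or-flat threshold rule are assumptions for illustration, not the paper's actual configuration.

```python
# Minimal sketch of the general approach the abstract describes: a stacked
# LSTM that predicts the next-step return from a window of past returns,
# paired with a simple threshold trading rule. All hyperparameters here are
# illustrative assumptions, not the paper's reported configuration.
import numpy as np
import tensorflow as tf

WINDOW = 30  # assumed look-back window length

def make_windows(prices: np.ndarray):
    """Turn a 1-D price series into (window, next-step return) pairs."""
    returns = np.diff(prices) / prices[:-1]
    X = np.stack([returns[i:i + WINDOW] for i in range(len(returns) - WINDOW)])
    y = returns[WINDOW:]
    return X[..., None], y  # add a feature axis for the LSTM

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64, return_sequences=True),  # "deep": stacked LSTM layers
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),  # predicted next-step return
])
model.compile(optimizer="adam", loss="mae")

def trade_signal(predicted_return: float, threshold: float = 0.0) -> int:
    """Illustrative strategy: go long if the model predicts a rise, else stay flat."""
    return 1 if predicted_return > threshold else 0
```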

Cited by 3 publications (4 citation statements) | References 18 publications
“…Some of these algorithms were based on gated recurrent neural networks (RNNs), autoencoders, convolutional neural networks, bidirectional mechanisms, attention mechanisms, ensemble techniques, and deep and vanilla architectures. Specific architectural design features of these 12 selected algorithms were: the gated LSTM architecture suggested by [13], [14] and [15]; the bidirectional mechanism combined with both LSTMs and GRUs, which influenced [16]; the attention mechanism combined with gated neural networks [17]; a deep convolutional neural network (CNN) ensemble with LSTM and an attention mechanism [18]; a GRU [19]; autoencoders combined with LSTM [20]; and finally deep gated recurrent neural network architectures made up of both GRU and LSTM [21]. iii Evaluation: the following factors were considered as potential performance evaluation criteria, with specific metrics: complexity, measured through the total number of parameters of each architecture; and accuracy, measured as mean absolute error (MAE), which is robust in environments with discrete irregular patterns because it measures the average magnitude of the errors in a set of predictions without considering their direction.…”
Section: Results
confidence: 99%
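The excerpt above uses MAE as its accuracy criterion. As a brief sketch of what that metric computes: the average magnitude of the prediction errors, with the sign (direction) of each error ignored.

```python
# Sketch of the MAE metric described above: the average magnitude of the
# errors in a set of predictions, without considering their direction.
import numpy as np

def mean_absolute_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.mean(np.abs(y_true - y_pred)))

# Example: errors of 0.5 and 1.0 average to an MAE of 0.75.
print(mean_absolute_error(np.array([1.0, 2.0]), np.array([1.5, 1.0])))
```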
“…On the other hand, models and algorithms designed around gated sequential architectures, in the form of LSTMs and GRUs, have been widely used in such analysis environments [13], [14], [15]. Thus the guidance from the SeLFISA framework will influence the development of deep learning artefacts that may demonstrate better performance than these suggested gated models.…”
Section: B Algorithms
confidence: 99%
“…This model uses an attention mechanism to capture long-term dependencies and interactions among features and perform multiple time series forecasting. The research focuses, in particular, on applying the Transformer model in portfolio management, which is used for trend-following strategies and volatility forecasting in multi-period portfolio optimization [29]. Although the Transformer model shows strong results in forecasting time series, the research also highlights the challenges of applying machine learning in finance, such as maintaining a balance between model complexity, tuning quality, and forecast quality.…”
Section: Empirical Literature Overview
confidence: 99%
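The excerpt credits the Transformer's attention mechanism with capturing long-term dependencies and interactions across time steps. Below is a minimal single-head sketch of the scaled dot-product attention at its core; the shapes and the omission of learned projections and multi-head logic are simplifying assumptions.

```python
# Minimal sketch of scaled dot-product attention, the core of the Transformer
# discussed above: each time step forms a weighted summary over all other
# steps, which is how long-range dependencies are captured directly.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d). Returns (seq_len, d)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise step-to-step affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over steps
    return weights @ V  # attention-weighted summary per step
```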
“…We refer to Greff et al (2016) for an account of the history of LSTM and its various specifications. Other applications may be found in the work of Chalvatzis and Hristu-Varsakelis (2019), Wu and Yan (2019), and Sirignano and Cont (2019).…”
Section: Introduction
confidence: 99%