2015
DOI: 10.1080/14697688.2015.1032541
Evolution of high-frequency systematic trading: a performance-driven gradient boosting model

Abstract: This paper proposes a performance-driven gradient boosting model (pdGBM) which predicts short-horizon price movements by combining nonlinear response functions of selected predictors. This model performs gradient descent in a constrained functional space by directly minimizing loss functions customized with different trading performance measurements. To demonstrate its practical applications, a simple trading system was designed with trading signals constructed from pdGBM predictions and fixed holding period in…

Cited by 6 publications (6 citation statements)
References 51 publications
“…Thus, we may consider a richer structure of MCB with multiple LBMs or even multiple UBMs. In addition, MCB can be extended to other classes of models, such as GLM and time series models (Meier et al, 2008; Zhou et al, 2015). We leave these topics for future research.…”
Section: Discussion
confidence: 99%
“…We have used the GBM implementation from the Scikit-learn library (Pedregosa et al 2011) for all our experiments. Furthermore, note that different variants of tree boosting have been empirically proven to be state-of-the-art methods in predictive tasks across different machine learning challenges (Bentéjac et al 2020; Chen and Guestrin 2016; Lu and Mazumder 2020; Taieb and Hyndman 2014; Gulin et al 2011) and more recently in finance (Zhou et al 2015; Sun et al 2018). Note that although by default GBM does not provide confidence intervals, we are not claiming that it is not possible to construct confidence intervals for GBM.…”
Section: Conclusion and Discussion
confidence: 93%
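The statement above refers to the GBM implementation in Scikit-learn. As a minimal sketch of what such an experimental setup might look like (the data, hyperparameters, and variable names here are illustrative assumptions, not taken from the cited study):

```python
# Hypothetical sketch: fitting Scikit-learn's gradient boosting regressor
# (Pedregosa et al. 2011) on synthetic data. All settings are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                              # synthetic predictors
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=500)   # synthetic target

gbm = GradientBoostingRegressor(
    n_estimators=200,    # number of boosting stages M
    learning_rate=0.05,  # shrinkage applied to each stage
    max_depth=3,         # depth of each regression-tree base learner
)
gbm.fit(X, y)
pred = gbm.predict(X[:5])   # point predictions only; no confidence intervals
```

As the quoted passage notes, `predict` returns point estimates; the default estimator does not attach confidence intervals to them.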
“…GBM is the gradient boosting machine (Friedman 2001). It has been empirically proven to be highly effective in predictive tasks across different machine learning challenges (Gulin et al 2011; Taieb and Hyndman 2014) and more recently in finance (Zhou et al 2015; Sun et al 2018). The feature vector fed into GBM is also the concatenation of features from order book and transaction data of two markets.…”
Section: Baseline Models and Time Setup
confidence: 99%
“…See Appendix for more details. Furthermore, note that different variants of tree boosting have been empirically proven to be state-of-the-art methods in predictive tasks across different machine learning challenges [26, 27] and more recently in finance [28, 29].…”
Section: Machine Learning Benchmark
confidence: 99%
“…$\beta_m h(x_t; a_m)$) (28). However, for practical purposes we first make the initial guess $F_0(x) = \arg\min_c \sum_{t=1}^{T} \Psi(y_t, c)$, and then the parameters are jointly fit in a forward incremental way for $m = 1, \ldots, M$:
$$(\beta_m, a_m) = \arg\min_{\beta, a} \sum_{t=1}^{T} \Psi\big(y_t, F_{m-1}(x_t) + \beta h(x_t; a)\big) \qquad (29)$$
and $F_m(x_t) = F_{m-1}(x_t) + \beta_m h(x_t; a_m)$.…”
confidence: 99%
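The forward stagewise recursion quoted above can be sketched in a few lines. This is a generic illustration assuming squared loss $\Psi(y, F) = (y - F)^2$ and depth-1 regression trees (stumps) as the base learner $h(x; a)$; under squared loss the initial guess $F_0$ is the sample mean, and a fixed shrinkage rate `lr` stands in for the line-search coefficient $\beta_m$, whose optimal value is absorbed into the stump's leaf values. Function and variable names are illustrative, not from the cited paper.

```python
# Minimal sketch of forward stagewise gradient boosting (Eq. 29),
# assuming squared loss and stump base learners. Illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_stagewise(X, y, M=50, lr=0.1):
    f0 = y.mean()                # F_0(x) = arg min_c sum_t (y_t - c)^2
    F = np.full(len(y), f0)
    stumps = []
    for m in range(M):
        resid = y - F            # negative gradient of squared loss at F_{m-1}
        h = DecisionTreeRegressor(max_depth=1).fit(X, resid)
        F = F + lr * h.predict(X)   # F_m = F_{m-1} + beta_m h(x; a_m)
        stumps.append(h)
    return f0, stumps

def predict_stagewise(f0, stumps, X, lr=0.1):
    F = np.full(len(X), f0)
    for h in stumps:
        F = F + lr * h.predict(X)
    return F
```

Each stage fits the base learner to the current residuals (the negative gradient of the loss), mirroring the joint fit of $(\beta_m, a_m)$ in Eq. (29), then updates the ensemble additively.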