2013
DOI: 10.7465/jkdi.2013.24.1.161

The study of foreign exchange trading revenue model using decision tree and gradient boosting


Cited by 10 publications (3 citation statements)
References 4 publications
“…The GB model involves no randomness and builds trees whose depth does not exceed five. The GB modeling method can therefore be described as chaining together as many shallow trees as possible [26]. Friedman's (2001) GB algorithm is as follows [5,27].…”
Section: Overview of Gradient Boosting and Research Cases
confidence: 99%
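The quotation above names Friedman's (2001) loop without showing it. A minimal sketch, assuming squared-error loss and depth-1 regression stumps as the shallow base learners; the function names (`fit_stump`, `gradient_boost`) are illustrative, not taken from the cited papers:

```python
def fit_stump(xs, residuals):
    """Pick the split threshold and two leaf means minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue  # degenerate split, skip
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm


def gradient_boost(xs, ys, n_rounds=50, lr=0.1):
    """F_0 is the mean of y; each round fits a stump to the current residuals
    (the negative gradient of squared loss) and adds it in, shrunk by lr."""
    f0 = sum(ys) / len(ys)
    stumps = []

    def predict(x):
        return f0 + lr * sum(h(x) for h in stumps)

    for _ in range(n_rounds):
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        stumps.append(fit_stump(xs, residuals))
    return predict
```

For example, `gradient_boost(list(range(10)), [0.0] * 5 + [1.0] * 5)` recovers a step function to within a few thousandths after 50 rounds, each round shrinking the residuals by a factor of (1 − lr).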
“…, C_m are combined into one classifier, each classifier C_k entering with weight log(γ_m), to create a final classifier [26].…”
Section: Overview of Gradient Boosting and Research Cases
confidence: 99%
“…Boosting is an iterative procedure that changes the distribution of the data so that the base classifier focuses on hard-to-classify observations. Gradient boosting (GBM) starts with equal weights on the observations and forms a final classifier by repeatedly assigning a higher weight to each incorrectly classified observation and a lower weight to each correctly classified one [27]. It approximates the function f, as shown in Equation (4), by a linear combination of weak learners h, optimized in a greedy manner by iteratively selecting the parameter θ_j and weight α_j of each weak learner [28].…”
Section: GBM (Gradient Boosting Algorithm)
confidence: 99%
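The greedy stagewise fit described in this quotation can be written out as follows; this is a sketch with a generic loss L and round index m, consistent with the additive form the statement attributes to Equation (4):

```latex
f(x) = \sum_{j=1}^{M} \alpha_j \, h(x; \theta_j),
\qquad
(\alpha_m, \theta_m)
  = \arg\min_{\alpha,\,\theta} \sum_{i=1}^{n}
    L\bigl(y_i,\; f_{m-1}(x_i) + \alpha\, h(x_i; \theta)\bigr),
\qquad
f_m = f_{m-1} + \alpha_m \, h(\cdot\,; \theta_m).
```

Each round adds the single weak learner (with its weight) that most reduces the loss given everything fitted so far, rather than optimizing all M terms jointly.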