Proceedings of the 2020 ACM-IMS on Foundations of Data Science Conference (FODS '20)
DOI: 10.1145/3412815.3416886
Classification Acceleration via Merging Decision Trees

Cited by 14 publications (22 citation statements) | References 22 publications
“…As summarized in a recent paper on trees (Fan and Li, 2020), there are two recent implementation tricks that have made boosted tree models considerably more practical:…”
Section: Boosted Tree Models (mentioning)
confidence: 99%
“…The above developments were summarized in a recent paper on trees (Fan and Li, 2020). Readers are also referred to some interesting discussions in 2010 https://hunch.net/?p=1467.…”
Section: Introduction (mentioning)
confidence: 99%
“…As always, we should first salute the pioneers in boosting and trees, e.g., Breiman et al (1983); Schapire (1990); Freund (1995); Freund and Schapire (1997); Bartlett et al (1998); Schapire and Singer (1999); Friedman et al (2000); Friedman (2001). As summarized in a recent paper on merging decision trees (Fan and Li, 2020), in the past 15 years or so, multiple practical developments have enhanced the performance as well as the efficiency of boosted tree algorithms, including
• The explicit (and robust) formula for the tree-split criterion using second-order gain information (Li, 2010b) (i.e., "Robust LogitBoost"), which typically improves accuracy compared to implementations based on the criterion using only first-order gain information (Friedman, 2001). It is nowadays the standard implementation in popular tree platforms.…”
Section: Introduction (mentioning)
confidence: 99%
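The second-order versus first-order split criterion mentioned in the quoted passage can be made concrete with a small sketch. The code below is not taken from Fan and Li (2020) or Li (2010b); it is a minimal illustration, under the usual gradient-boosting setup with per-sample first derivatives (gradients) and second derivatives (Hessians) of the loss, of a second-order split gain next to a first-order, count-weighted one. The function names `second_order_gain` and `first_order_gain` and the toy data are hypothetical.

```python
import numpy as np

def second_order_gain(g, h, left_mask, eps=1e-12):
    """Second-order split gain: each child is weighted by its summed
    second derivatives (Hessians); eps guards against empty children."""
    G, H = g.sum(), h.sum()
    G_L, H_L = g[left_mask].sum(), h[left_mask].sum()
    G_R, H_R = G - G_L, H - H_L
    return G_L**2 / (H_L + eps) + G_R**2 / (H_R + eps) - G**2 / (H + eps)

def first_order_gain(g, left_mask, eps=1e-12):
    """First-order-only criterion: same form, but the Hessian sums are
    replaced by plain sample counts (variance-reduction style)."""
    n = g.size
    n_L = int(left_mask.sum())
    n_R = n - n_L
    G = g.sum()
    G_L = g[left_mask].sum()
    G_R = G - G_L
    return G_L**2 / (n_L + eps) + G_R**2 / (n_R + eps) - G**2 / (n + eps)

# Toy usage: derivatives of the logistic loss for a binary problem.
p = np.array([0.9, 0.8, 0.2, 0.1])   # current predicted probabilities
y = np.array([1.0, 1.0, 0.0, 0.0])   # labels
g = p - y                             # first derivatives
h = p * (1 - p)                       # second derivatives
mask = np.array([True, True, False, False])  # candidate left/right split
print(second_order_gain(g, h, mask))
print(first_order_gain(g, mask))
```

The only difference between the two criteria in this sketch is whether the denominators are summed second derivatives or raw sample counts; that Hessian weighting is the refinement the quoted passage attributes to the second-order formulation.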