Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/306

Learn Smart with Less: Building Better Online Decision Trees with Fewer Training Examples

Abstract: Online decision tree models are extensively used in many industrial machine learning applications for real-time classification tasks. These models are highly accurate, scalable and easy to use in practice. The Very Fast Decision Tree (VFDT) is the classic online decision tree induction model that has been widely adopted due to its theoretical guarantees as well as competitive performance. However, VFDT and its variants rely solely on conservative statistical measures like the Hoeffding bound…
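For context on the bound the abstract refers to: the Hoeffding inequality says that, for a random variable with range R, the empirical mean of n i.i.d. observations is within epsilon = sqrt(R^2 ln(1/delta) / (2n)) of the true mean with probability at least 1 - delta. A minimal Python sketch of how VFDT-style learners typically use it to gate node splits (function names and default parameters are illustrative, not taken from the paper):

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Epsilon such that the empirical mean of n observations is within
    epsilon of the true mean with probability at least 1 - delta."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain: float, second_gain: float, n: int,
                 delta: float = 1e-7, gain_range: float = 1.0) -> bool:
    # Split only when the best attribute's observed gain beats the
    # runner-up by more than epsilon, so that the best attribute is
    # truly best with probability at least 1 - delta.
    return (best_gain - second_gain) > hoeffding_bound(gain_range, delta, n)
```

The conservatism the abstract criticizes is visible here: epsilon shrinks only as 1/sqrt(n), so a node can require many examples before the split test passes.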

Cited by 10 publications (8 citation statements); references 15 publications. Citing publications span 2020–2023.
“…In the research [12], Ariyam Das et al. presented a memory-efficient bootstrap simulation heuristic (Mem-ES) that effectively accelerates the learning process. Experiments show that efficient resampling speeds up node splits for online decision tree learning.…”
Section: Related Work
confidence: 99%
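Online bootstrap resampling of the kind described above is commonly simulated by giving each arriving example a Poisson(1) multiplicity per resample, as in Oza and Russell's online bagging; the sketch below assumes that flavor (the accumulator interface is hypothetical, and Mem-ES's memory-efficiency tricks are not shown):

```python
import math
import random

def poisson_weight(lam: float = 1.0) -> int:
    """Poisson(lam) draw via Knuth's method: the multiplicity of the
    current example in one simulated bootstrap resample."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def feed(resample_stats, x, y):
    # Each simulated resample receives the example w times, where
    # w ~ Poisson(1); w = 0 means the resample skips this example,
    # mimicking sampling with replacement over the stream.
    for stats in resample_stats:          # one accumulator per resample
        w = poisson_weight()
        if w > 0:
            stats.update(x, y, weight=w)  # hypothetical accumulator API
```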
“…Except for [4], the only existing dynamic algorithms for decision trees are incremental: they receive a stream of labeled examples and maintain a decision tree that performs well compared to the tree built on the examples seen so far. The first such algorithms were Hoeffding trees [9], which spurred a line of research on trees that adapt to so-called concept drifts [14,11,18,7,24,12,16,21]; see [17] for a survey. Unfortunately, all those algorithms assume the examples are i.i.d., which allows them to compute splits that have nearly-maximum gain with high confidence.…”
Section: Related Work
confidence: 99%
“…Several lines of research focus on different aspects of the tree learning problem: examples include shallow and sparse trees [Kumar et al., 2017], high-capacity decision nodes [Kumar et al., 2017; Balestriero, 2017] and/or leaf nodes [Hsieh et al., 2014], efficient inference [Jose et al., 2013], soft decisions [Frosst & Hinton, 2017], joint representation learning and tree learning [Kontschieder et al., 2015], and incremental tree learning in streaming scenarios [Domingos & Hulten, 2000; Jin & Agrawal, 2003; Manapragada et al., 2018; Das et al., 2019] (see Remark 1).…”
Section: Related Work
confidence: 99%
“…Remark 1 (Bandit vs. standard online setting). The popular line of fast online/incremental tree induction/learning methods [Domingos & Hulten, 2000; Jin & Agrawal, 2003; Manapragada et al., 2018; Das et al., 2019] works in the setting where (a) labeled examples, i.e. (x_i, y_i), arrive online, and (b) the loss is implicit or known.…”
Section: Problem Setup
confidence: 99%
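A minimal sketch of the standard online protocol described in (a) and (b), to contrast with the bandit setting (the model interface is illustrative):

```python
def standard_online_loop(stream, model):
    """Standard online setting: x_i arrives, the model predicts, then the
    true label y_i is revealed, so every prediction's loss is knowable."""
    mistakes = 0
    for x, y in stream:
        y_hat = model.predict(x)    # predict before the label is revealed
        mistakes += int(y_hat != y)
        model.update(x, y)          # incremental update, e.g. one VFDT step
    return mistakes
```

In the bandit variant, only the loss of the chosen prediction y_hat would be observed, so the update could not rely on seeing the true label y.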