1996
DOI: 10.1007/bf00058613

Efficient incremental induction of decision trees

Abstract: This paper proposes a method to improve ID5R, an incremental TDIDT algorithm. The new method evaluates the quality of the attributes selected at the nodes of a decision tree and estimates a minimum number of learning steps for which those attributes are guaranteed to remain selected. This reduces the overhead of incremental learning. The method is supported by theoretical analysis and experimental results.
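The general idea behind deferring attribute re-evaluation can be sketched informally. The code below is a hypothetical illustration only, not the paper's actual bound: it computes information gain for each candidate attribute and uses the gap between the best and second-best gain as a crude heuristic estimate of how many further examples the current choice can survive. The function names `info_gain` and `steps_attribute_stays_best` are my own inventions.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(examples, attr):
    """Information gain of splitting `examples` (list of (features, label)
    pairs, features being a dict) on attribute `attr`."""
    labels = [y for _, y in examples]
    by_value = {}
    for x, y in examples:
        by_value.setdefault(x[attr], []).append(y)
    remainder = sum(len(s) / len(examples) * entropy(s)
                    for s in by_value.values())
    return entropy(labels) - remainder

def steps_attribute_stays_best(examples, attrs):
    """Hypothetical stand-in for the paper's estimate: pick the attribute
    with the highest gain, and treat the gain gap to the runner-up as
    proportional to the number of new examples the ranking can absorb
    before re-evaluation is worthwhile."""
    gains = sorted((info_gain(examples, a), a) for a in attrs)
    (second_gain, _), (best_gain, best) = gains[-2], gains[-1]
    gap = best_gain - second_gain
    return best, max(1, int(gap * len(examples)))
```

On a toy dataset where attribute `a` perfectly predicts the label and `b` is uninformative, the gap is maximal and re-evaluation can be deferred longest; the actual guarantee in the paper comes from its theoretical analysis, not this heuristic.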

Cited by 46 publications
(37 citation statements)
References 15 publications
“…The online random forest as proposed by [58] incrementally adopts new features. Updating decision trees with new samples was described by [59,60] and extended by [61,62]. Update schemes for pools of experts like the WINNOW and Weighted Majority Algorithm were introduced by [63,64] and successfully employed since then.…”
Section: Inter-active and Online Learning For Clinical Applicationmentioning
confidence: 99%
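The Weighted Majority algorithm mentioned in the snippet above is simple enough to sketch. This is a hedged illustration of the classic scheme (Littlestone and Warmuth), not code from any of the cited works: each expert starts with weight 1, the learner predicts by weighted vote, and every expert that errs has its weight multiplied by a penalty factor.

```python
def weighted_majority(expert_predictions, labels, beta=0.5):
    """Run Weighted Majority over a sequence of rounds.

    expert_predictions: list of rounds; each round is a list with one
    0/1 prediction per expert. labels: the true 0/1 outcome per round.
    Returns (learner's predictions, final expert weights)."""
    n_experts = len(expert_predictions[0])
    weights = [1.0] * n_experts
    out = []
    for preds, y in zip(expert_predictions, labels):
        # Weighted vote: sum the weight behind each outcome.
        vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
        vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
        out.append(1 if vote1 >= vote0 else 0)
        # Penalize every expert that was wrong this round.
        weights = [w * beta if p != y else w
                   for w, p in zip(weights, preds)]
    return out, weights
```

With two experts, one always right and one always wrong, the wrong expert's weight decays geometrically and the learner quickly tracks the good expert; the standard analysis bounds the learner's mistakes relative to the best expert's.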
“…By representing the stereotyping function in this way, we can make use of well-known techniques for inducing decision trees from labeled examples [Frank et al. 1998; Kalles and Morris 1996] to encapsulate all of an agent's stereotypes about others regarding features in F in one concise structure. Each node of the tree represents a particular feature, and branches from nodes are followed depending on the perceived value of the feature represented by that node.…”
Section: Learning Stereotypesmentioning
confidence: 99%
“…Decision trees. The problem of processing streaming data online has motivated the development of many algorithms designed to learn decision trees incrementally [16], [17]. Some examples of algorithms which construct incremental decision trees are ID4 [18], ID5 [19], and ID5R [20].…”
Section: Background and Related Workmentioning
confidence: 99%
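The incremental-learner family cited in the last snippet (ID4, ID5, ID5R) shares one mechanism: sufficient statistics for the split test are updated per example, so the tree can be revised without re-reading past data. The sketch below is a deliberately simplified, hypothetical node structure illustrating that bookkeeping; it is not the actual ID4/ID5R algorithm, which additionally restructures the tree.

```python
from collections import defaultdict

class IncrementalNode:
    """Per-node counts sufficient to re-evaluate candidate splits
    incrementally: counts[attr][value][label] -> examples seen."""

    def __init__(self):
        self.counts = defaultdict(
            lambda: defaultdict(lambda: defaultdict(int)))
        self.n = 0

    def update(self, features, label):
        """Fold one new (features, label) example into the statistics;
        no previously seen example needs to be revisited."""
        self.n += 1
        for attr, value in features.items():
            self.counts[attr][value][label] += 1

    def majority(self, attr, value):
        """Majority class among examples with features[attr] == value."""
        dist = self.counts[attr][value]
        return max(dist, key=dist.get)
```

Because each update touches only counters, the cost per example is independent of how many examples came before it, which is the property the deferral method in the paper above further exploits.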