2010
DOI: 10.1007/s11633-010-0517-5
Improving decision tree performance by exception handling

Abstract: This paper focuses on improving decision tree induction algorithms when a kind of tie appears during the rule generation procedure for specific training datasets. The tie occurs when there are equal proportions of the target class outcome in a leaf node's records, a situation in which majority voting cannot be applied. To resolve this exception, we propose to base the prediction on the naive Bayes (NB) estimate, k-nearest neighbour (k-NN), and association rule mining (ARM)…
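The mechanism the abstract describes (detect leaves where the majority class is tied, then fall back to a secondary estimator) can be sketched as follows. This is a minimal illustration with scikit-learn, not the paper's exact procedure: the dataset, the globally fitted GaussianNB fallback, and the equality-based tie test are assumptions for the example, and the paper also considers k-NN and ARM fallbacks.

```python
# Sketch: break majority-vote ties at decision-tree leaves with a naive
# Bayes estimate. Illustrative only; the paper may fit NB per leaf rather
# than on the whole training set as done here.
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
nb = GaussianNB().fit(X, y)                 # fallback model (fit globally here)

dist = tree.tree_.value[:, 0, :]            # per-node class distribution
top = dist.max(axis=1, keepdims=True)
tied_node = (dist == top).sum(axis=1) > 1   # nodes where the majority is tied

leaf = tree.apply(X)                        # leaf index for each sample
pred = tree.predict(X)
mask = tied_node[leaf]                      # samples landing in a tied leaf
if mask.any():
    pred[mask] = nb.predict(X[mask])        # break those ties with the NB estimate
```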

Cited by 8 publications (2 citation statements) · References 21 publications
“…The splitting process would be stopped if stopping rules are met, such as tree depth or minimum number of records in a leaf. Parameters that may influence the performance of a decision tree model include the maximum number of features used for splitting, the maximum depth of the decision tree, the minimum number of samples in a leaf node, and the minimum number of samples to split [53,54]. Among all the parameters, the minimum number of samples to split and the maximum depth of the decision tree are two of the most important that affect the performance of the model and should be seriously considered when adjusting parameters [55][56][57].…”
Section: Description of Classifiers (mentioning)
confidence: 99%
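The excerpt above names four complexity parameters for a decision tree. A minimal illustration of tuning them, using scikit-learn's DecisionTreeClassifier (the dataset and all grid values are arbitrary examples, not taken from the cited works):

```python
# Illustrative grid search over the four parameters named in the excerpt;
# per the excerpt, min_samples_split and max_depth matter most.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={
        "max_depth": [3, 5, 10, None],      # maximum depth of the tree
        "min_samples_split": [2, 10, 50],   # minimum samples to split a node
        "min_samples_leaf": [1, 5, 20],     # minimum samples in a leaf node
        "max_features": [None, "sqrt"],     # features considered per split
    },
    cv=5,
)
search.fit(X, y)
print(search.best_params_)
```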
“…However, none of the research works on the discovery of exceptions discussed so far uses decision trees as the rule induction method. We have found only two research works [28], [16] that deal with exception handling while performing classification with a decision tree algorithm. These works modify the rule generation algorithm to resolve ties between classes that emerge at a leaf node during tree construction.…”
Section: Literature Review (mentioning)
confidence: 99%