2013
DOI: 10.1145/2431211.2431215

A survey of cost-sensitive decision tree induction algorithms

Abstract: The past decade has seen significant interest in the problem of inducing decision trees that take account of both the costs of misclassification and the costs of acquiring the features used for decision making. This survey identifies over 50 algorithms, including approaches that are direct adaptations of accuracy-based methods, approaches that use genetic algorithms or anytime methods, and approaches that utilize boosting and bagging. The survey brings together these different studies and novel approaches to cost-sensitive decision tree learning, provi…

Cited by 174 publications (119 citation statements) · References 65 publications · Citing publications span 2014–2024.

Selected citation statements, ordered by relevance:
“…Lomax [7] gave a thorough summary of cost-sensitive decision trees and presented several classical cost-sensitive decision tree algorithms based on the greedy method, such as EG2, CS-ID3, IDX, CS-C4.5, MinCost and PM, of which MinCost and PM are the most representative. The formula used in splitting attribute selection is as follows:…”
Section: Related Work
confidence: 99%
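The splitting formula quoted above is cut off in this extract. For orientation, the greedy criteria this family of algorithms uses are usually stated as cost-adjusted variants of information gain; the forms below for EG2, CS-ID3 and IDX are as commonly given in the literature (a sketch of the standard statements, not a reconstruction of the exact formula the citing paper shows), with $\Delta I_i$ the information gain of attribute $i$, $C_i$ its acquisition cost, and $\omega \in [0,1]$ a parameter controlling the degree of cost sensitivity:

$$\text{EG2: } \frac{2^{\Delta I_i} - 1}{(C_i + 1)^{\omega}} \qquad\quad \text{CS-ID3: } \frac{(\Delta I_i)^2}{C_i} \qquad\quad \text{IDX: } \frac{\Delta I_i}{C_i}$$

In each case the attribute that maximizes the criterion is selected, trading information gain against the cost of acquiring the attribute.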
“…For this reason, some researchers introduced the concept of cost-sensitive learning into traditional classification algorithms and produced many outstanding cost-sensitive classification algorithms, of which the cost-sensitive decision tree [7] is one. The cost-sensitive decision tree inherits the advantages of the traditional decision tree algorithm while overcoming its problem of ignoring the various costs that arise in classification.…”
Section: Introduction
confidence: 99%
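To make "taking the costs into account" concrete, the following minimal Python sketch shows the minimum-expected-cost labelling rule that cost-sensitive trees typically apply at a leaf; the function name and the example cost matrix are illustrative assumptions, not taken from the cited paper.

    # Minimal sketch: label a leaf by minimizing expected misclassification cost.
    # cost[i][j] = cost of predicting class i when the true class is j.
    def min_cost_label(class_counts, cost):
        total = sum(class_counts)
        probs = [n / total for n in class_counts]  # P(true class = j) in the leaf
        expected = [
            sum(cost[i][j] * probs[j] for j in range(len(probs)))
            for i in range(len(cost))
        ]
        return min(range(len(expected)), key=expected.__getitem__)

    # Example: 90 negatives and 10 positives in a leaf, but a false negative
    # costs 20 times more than a false positive, so the leaf predicts positive.
    print(min_cost_label([90, 10], [[0, 20], [1, 0]]))  # -> 1

An accuracy-based tree would label this leaf with the majority class; the cost matrix flips the decision, which is exactly the behaviour the excerpt describes.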
“…Domingos concluded that this algorithm provides good results on large data sets. In addition, most researchers have dealt with this problem by changing the data distributions to reflect the costs, though most of them utilize a decision tree learner as the base learner; the reader is referred to Lomax and Vadera [4] for a comprehensive survey of cost-sensitive decision tree algorithms.…”
Section: Review of Previous Work on Sampling Approach
confidence: 99%
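One widely used way of "changing the data distributions to reflect the costs" is cost-proportionate rejection sampling, in which each example is kept with probability proportional to its cost and any accuracy-based learner is then trained on the resampled data as a black box. The sketch below illustrates the idea under that assumption; it is not code from the cited papers.

    import random

    # Keep example i with probability cost_i / max_cost, so the resampled set
    # approximates the cost-weighted distribution of the training data.
    def rejection_sample(examples, costs, rng=random.Random(0)):
        z = max(costs)
        return [x for x, c in zip(examples, costs) if rng.random() < c / z]

    data = [("x%d" % i, i % 2) for i in range(10)]
    costs = [10.0 if y == 1 else 1.0 for _, y in data]  # minority class costs more
    print(rejection_sample(data, costs))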
“…However, in practice, the performance of any learning algorithm normally has to take the cost of misclassification into account. Hence, in recent years, a significant level of attention has been paid to cost-sensitive learning, including making accuracy-based learners cost-sensitive [3,4]. Zadrozny et al [6] divide cost-sensitive classifiers into two categories: the amending approach (treating the classifier as a transparent box and modifying it) and the sampling approach (using the classifier as a black box).…”
Section: Introduction
confidence: 99%
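For the binary case, the amending approach has a classic closed form due to Elkan: with $c_{ij}$ the cost of predicting class $i$ when the true class is $j$, and $c_{00} = c_{11} = 0$, the optimal decision is to predict the positive class whenever the estimated probability $p = P(y = 1 \mid x)$ exceeds

$$p^{*} = \frac{c_{10}}{c_{10} + c_{01}}.$$

For example, if a false negative ($c_{01}$) is ten times as costly as a false positive ($c_{10}$), the threshold moves from 0.5 to $1/11 \approx 0.09$; the underlying classifier is untouched and only its decision rule is amended.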
“…Those that alter the learning algorithm appear to focus either on decision tree classifiers, where cost sensitivity is achieved by pruning [10], or on Support Vector Machines (SVMs), which all use a hinge-loss approximation of the cost-sensitive loss function [9]. A cost-sensitive learning algorithm applicable as an augmentation to deep networks appeared in the literature only recently [7] and, as with all the CSS, it has not considered the broader problem of shaping the entire error distribution, but only compensating for imbalance in the input class distribution.…”
Section: Introduction
confidence: 99%
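For reference, the class-weighted hinge loss underlying most such SVM formulations is commonly written as below; this is the standard textbook form, a convex upper bound on the cost-weighted 0-1 loss, and not necessarily the exact formulation used in [9]:

$$\ell(y, f(x)) = \begin{cases} C_{FN}\,\max(0,\; 1 - f(x)), & y = +1,\\ C_{FP}\,\max(0,\; 1 + f(x)), & y = -1, \end{cases}$$

where $C_{FN}$ and $C_{FP}$ are the costs of false negatives and false positives; weighting the costlier error more heavily shifts the learned decision boundary so that fewer of those errors are made.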