2013 IEEE International Conference on Computer Vision
DOI: 10.1109/iccv.2013.31
Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees


Cited by 8 publications (5 citation statements)
References: 29 publications
“…The RUSBoost method has been shown to provide a good solution to this problem in our work, which utilizes a sampling technique to handle class imbalance. Other recent approaches (Mac Aodha and Brostow, 2013; Bahnsen et al., 2015) may provide an alternative way of handling the class imbalance problem. Furthermore, in this work, different types of features were concatenated for classification.…”

Section: Discussion

confidence: 99%
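The sampling idea referenced in this statement can be illustrated with a minimal sketch. This is not the cited RUSBoost implementation (which combines random undersampling with boosting); it shows only the undersampling step, and the function name and toy data are invented for illustration:

```python
import numpy as np

def random_undersample(X, y, seed=0):
    """Randomly undersample so every class is reduced to the size of
    the smallest class -- the sampling step behind RUSBoost-style
    imbalance handling (illustrative only, not the cited method)."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    rng.shuffle(keep)
    return X[keep], y[keep]

# Toy 95/5 imbalanced data (invented for illustration)
X = np.random.randn(100, 3)
y = np.array([0] * 95 + [1] * 5)
X_bal, y_bal = random_undersample(X, y)
print(np.bincount(y_bal))  # [5 5]
```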
“…So, for example, the cost of misclassifying an unsafe process plant as safe is likely to be much higher than the cost of misclassifying a safe plant as unsafe. The past three decades have seen significant interest in this problem, known as cost-sensitive induction, with the development of a number of independent algorithms [8,19,22,34,35,36,37,38,62,63]. Authors have also recognised that in practice there are other costs involved [5,7,8,9].…”

Section: Cost-sensitive Decision Tree Learning

confidence: 99%
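The process-plant example above amounts to choosing the class with the lowest expected cost under an asymmetric cost matrix. A minimal sketch of that decision rule follows; the cost values are assumptions invented for illustration, not taken from any cited work:

```python
import numpy as np

# Hypothetical cost matrix C[true, predicted] for the safe/unsafe
# plant example: missing an unsafe plant (row 1, col 0) is far more
# expensive than a false alarm. The numbers are invented.
C = np.array([[0.0,  1.0],   # true safe:   correct, false alarm
              [50.0, 0.0]])  # true unsafe: missed hazard, correct

def min_expected_cost(p_unsafe):
    """Return 0 (safe) or 1 (unsafe), minimizing expected cost."""
    p = np.array([1.0 - p_unsafe, p_unsafe])  # class probabilities
    return int(np.argmin(p @ C))              # expected cost per prediction

# Even a 10% chance of "unsafe" is enough to flag the plant here.
print(min_expected_cost(0.10))  # 1
```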
“…Lastly, the base classifiers are combined using majority voting, cost-sensitive weighted voting, or cost-sensitive stacking. Majority voting consists of collecting the predictions of each base classifier and selecting the decision with the highest number of votes, see (7).…”

Section: Algorithms

confidence: 99%
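A minimal sketch of the two voting schemes named in this statement. The weights passed to the weighted variant are placeholders; the actual cost-sensitive weights and equation (7) from the cited work are not reproduced here:

```python
from collections import Counter
import numpy as np

def majority_vote(predictions):
    """predictions: list of per-classifier label arrays (n_samples,).
    Returns the label with the most votes for each sample."""
    P = np.asarray(predictions)                       # (n_clf, n_samples)
    return np.array([Counter(P[:, i]).most_common(1)[0][0]
                     for i in range(P.shape[1])])

def weighted_vote(predictions, weights):
    """Cost-sensitive weighted voting (sketch): each classifier's vote
    is scaled by a weight, e.g. derived from its cost-based performance.
    How the weights are computed in the cited work is not shown here."""
    P = np.asarray(predictions)
    w = np.asarray(weights, dtype=float)
    labels = np.unique(P)
    scores = np.array([[w[P[:, i] == c].sum() for c in labels]
                       for i in range(P.shape[1])])
    return labels[scores.argmax(axis=1)]

preds = [np.array([0, 1, 1]), np.array([1, 1, 0]), np.array([1, 0, 0])]
print(majority_vote(preds))                   # [1 1 0]
print(weighted_vote(preds, [0.2, 0.3, 0.9]))  # heaviest voter dominates
```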
“…Also, none of these methods takes into account the cost of correct classification. Moreover, the literature on example-dependent cost-sensitive methods is limited, often because there is a lack of publicly available datasets that fit the problem [7]. Recently, we have proposed different methods that take into account the different example-dependent costs, in particular: Bayes minimum risk (BMR) [8], cost-sensitive logistic regression [9], and cost-sensitive decision tree (CSDT) [10].…”

Section: Introduction

confidence: 99%
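The Bayes minimum risk (BMR) rule mentioned here can be sketched for the binary case with example-dependent costs: predict positive whenever the expected cost of doing so is lower than that of predicting negative. The function signature and the toy transaction data below are assumptions for illustration, not the cited implementation:

```python
import numpy as np

def bayes_minimum_risk(p_pos, c_fp, c_fn, c_tp=None, c_tn=None):
    """Example-dependent BMR sketch: costs are per-example vectors;
    costs of correct classification (c_tp, c_tn) default to zero."""
    p_pos = np.asarray(p_pos, dtype=float)
    c_tp = np.zeros_like(p_pos) if c_tp is None else np.asarray(c_tp)
    c_tn = np.zeros_like(p_pos) if c_tn is None else np.asarray(c_tn)
    risk_pos = (1 - p_pos) * c_fp + p_pos * c_tp   # cost of predicting 1
    risk_neg = p_pos * c_fn + (1 - p_pos) * c_tn   # cost of predicting 0
    return (risk_pos <= risk_neg).astype(int)

# Two transactions with the same fraud probability but different
# amounts: only the high-amount one is worth flagging, because the
# cost of a missed fraud (c_fn) depends on the example.
p = np.array([0.05, 0.05])
amount = np.array([10.0, 5000.0])                  # per-example FN cost
print(bayes_minimum_risk(p, c_fp=np.array([2.0, 2.0]), c_fn=amount))
# [0 1]
```

The point of the example-dependent formulation is visible in the output: a fixed probability threshold would treat both transactions identically, while BMR flags only the one whose misclassification is expensive.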