2011
DOI: 10.1109/tpami.2010.71

Cost-Sensitive Boosting

Abstract: A novel framework, based on the statistical interpretation of boosting, is proposed for the design of cost-sensitive boosting algorithms. It is argued that, although predictors produced with boosting converge to the ratio of posterior class probabilities that also appears in the Bayes decision rule, this convergence only occurs in a small neighborhood of the optimal cost-insensitive classification boundary. This is due to a combination of the cost-insensitive nature of current boosting losses and boosting's sampl…
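
For context, the "statistical interpretation of boosting" invoked here is the classical result that the population minimizer of AdaBoost's exponential loss is half the posterior log-odds, while the cost-sensitive Bayes rule shifts the decision threshold by the ratio of misclassification costs. A hedged restatement (the notation C_1, C_2 is assumed for illustration, not taken from the paper):

```latex
% Population minimizer of the exponential loss E[exp(-y f(x))], y in {-1,+1}:
f^{*}(x) = \tfrac{1}{2}\,\log\frac{P(y=+1 \mid x)}{P(y=-1 \mid x)}
% Cost-insensitive Bayes rule: decide +1 iff f^{*}(x) > 0.
% With C_1 the cost of a false negative and C_2 the cost of a false positive
% (assumed notation), the cost-sensitive Bayes rule moves the threshold:
\text{decide } +1 \iff \frac{P(y=+1 \mid x)}{P(y=-1 \mid x)} > \frac{C_2}{C_1}
```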

Cited by 158 publications (144 citation statements)
References 48 publications
“…As for the other variants, they are all methods that modify the training algorithm. CSB0 and CSB1 [17] do not use confidence-rated predictions and, based on the results of comparative studies [9,10,15], the two variants are typically dominated by CSB2. Asymmetric-AdaBoost [18] was excluded from those studies as being similar to CSB2.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
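
The CSB variants contrasted in the statement above differ only in their per-round weight update. Below is a minimal sketch of that pattern, assuming the update forms usually attributed to [17] (a cost factor applied to misclassified examples, with CSB2 alone keeping AdaBoost's confidence-rated step α_t); the cost vector and weak-learner outputs are placeholders:

```python
import numpy as np

def csb_weight_update(w, y, h, alpha, cost, variant="CSB2"):
    """One boosting-round weight update for the CSB family (sketch).

    w     : current example weights, shape (n,)
    y     : true labels in {-1, +1}, shape (n,)
    h     : weak-learner predictions in {-1, +1}, shape (n,)
    alpha : confidence of the weak learner (used by CSB2 only)
    cost  : per-example misclassification costs, shape (n,)
    """
    # Cost factor: misclassified examples are inflated by their cost.
    c = np.where(y != h, cost, 1.0)
    if variant == "CSB0":    # cost factor only
        w_new = c * w
    elif variant == "CSB1":  # adds the exponential term, but no alpha
        w_new = c * w * np.exp(-y * h)
    else:                    # CSB2: AdaBoost's update times the cost factor
        w_new = c * w * np.exp(-alpha * y * h)
    return w_new / w_new.sum()  # renormalize to a distribution
```

The absence of α in CSB0 and CSB1 is exactly the “do not use confidence-rated predictions” point made above.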
“…AdaCost [2,17] is also outperformed by AdaC2 and CSB2, and so is AdaC3 [16]. CS-AdaBoost [9,10], despite being the only method other than AdaC2 with a solid theoretical basis, has been characterized as ‘time-consuming and imprecise’ [19], as it lacks a closed-form solution for α_t, and the optimization of its parameters is therefore computationally intensive.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
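
The “no closed-form solution for α_t” point quoted above can be made concrete: under an asymmetric exponential loss, the per-round step size must be found by one-dimensional numerical minimization rather than by AdaBoost's analytic formula. A minimal sketch under that assumption (the objective below is illustrative, not the exact loss of [9,10]):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_alpha(w, y, h, c_pos, c_neg):
    """Numerically choose the step size alpha for one boosting round
    of an illustrative cost-sensitive exponential loss (sketch).

    Minimizes sum_i w_i * exp(-alpha * C_i * y_i * h_i), where the class
    cost C_i is c_pos for y=+1 and c_neg for y=-1; when c_pos != c_neg
    the minimizer has no closed form, hence the numerical search.
    """
    cost = np.where(y > 0, c_pos, c_neg)
    margin = y * h  # +1 when the weak learner is correct, -1 when wrong

    def loss(alpha):
        return np.sum(w * np.exp(-alpha * cost * margin))

    res = minimize_scalar(loss, bounds=(0.0, 10.0), method="bounded")
    return res.x
```

Each call is a cheap scalar optimization, but repeating it every round, for every candidate cost setting, is what makes such methods “time-consuming” relative to variants with an analytic α_t.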
“…Bagging [10] and boosting [30] are two popular methods for building ensembles of classifiers, with a rich history of extensions [17,31,39,61,74,78]. In this section we outline various approaches that have been taken to make bagging and boosting methods overcome concept drift.…”
Section: Bagging and Boosting Based Methods (citation type: mentioning; confidence: 99%)
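
As one concrete anchor for the bagging side of the statement above, many stream and drift-aware ensembles build on Oza-Russell online bagging, which replaces bootstrap resampling with Poisson(1) per-example weighting. A minimal sketch; the class and its API are illustrative, assuming scikit-learn-style base learners exposing partial_fit, and are not the specific method of any cited paper:

```python
import numpy as np

class OnlineBagging:
    """Sketch of Oza-Russell online bagging for streaming data.

    Each incoming example is shown to each base learner k times with
    k ~ Poisson(1), which approximates bootstrap sampling on a stream.
    Assumes binary labels {0, 1} and base learners with partial_fit.
    """

    def __init__(self, make_learner, n_learners=10, seed=0):
        self.learners = [make_learner() for _ in range(n_learners)]
        self.rng = np.random.default_rng(seed)

    def partial_fit(self, x, y):
        for learner in self.learners:
            for _ in range(self.rng.poisson(1.0)):
                learner.partial_fit([x], [y], classes=[0, 1])

    def predict(self, x):
        votes = [learner.predict([x])[0] for learner in self.learners]
        return max(set(votes), key=votes.count)  # majority vote
```

Drift-aware extensions typically vary the Poisson rate or attach per-learner change detectors; with, e.g., make_learner=lambda: SGDClassifier(loss="log_loss"), the sketch runs on a labeled stream once each learner has seen at least one example.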