2017
DOI: 10.1007/s10844-017-0446-7

Multi-class and feature selection extensions of Roughly Balanced Bagging for imbalanced data

Abstract: Roughly Balanced Bagging is one of the most efficient ensembles specialized for class imbalanced data. In this paper, we study its basic properties that may influence its good classification performance. We experimentally analyze them with respect to bootstrap construction, deciding on the number of component classifiers, their diversity, and ability to deal with the most difficult types of the minority examples. Then, we introduce two generalizations of this ensemble for dealing with a higher number of attrib…

Cited by 67 publications (38 citation statements)
References 51 publications
“…On the other hand, the one‐vs‐one technique trains classifiers that are quadratic with respect to the number of class types, which can be costly when the number of classes is high. To handle class imbalance, various techniques can be used such as undersampling, oversampling, assigning weights to data samples belonging to different classes, and techniques such as multi‐class roughly balanced bagging [76]…”
Section: Results (mentioning)
Confidence: 99%
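The bootstrap construction that distinguishes Roughly Balanced Bagging from plain undersampling can be sketched as follows. This is an illustrative sketch, not code from the paper: the function name, argument names, and the use of a negative binomial draw with p = 0.5 (so the expected majority sample size equals the minority class size) follow the commonly described formulation of the technique.

```python
import numpy as np

def roughly_balanced_bootstrap(X, y, minority_label, rng=None):
    # One bootstrap of Roughly Balanced Bagging: the minority sample
    # size is fixed at the minority class size, while the majority
    # sample size is drawn from a negative binomial distribution, so
    # the two classes are only *roughly* (not exactly) balanced.
    rng = np.random.default_rng() if rng is None else rng
    min_idx = np.flatnonzero(y == minority_label)
    maj_idx = np.flatnonzero(y != minority_label)
    n_min = len(min_idx)
    # E[n_maj] = n_min * (1 - 0.5) / 0.5 = n_min
    n_maj = rng.negative_binomial(n_min, 0.5)
    sample = np.concatenate([
        rng.choice(min_idx, size=n_min, replace=True),
        rng.choice(maj_idx, size=max(n_maj, 1), replace=True),
    ])
    return X[sample], y[sample]
```

Repeating this draw per component classifier yields bootstraps whose class ratios vary around balance, which is one source of the ensemble's diversity discussed in the abstract.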
“…The related standard approaches are Global-CS and Static-SMOTE as representatives of over sampling applied to single classifiers, decomposition with OVA and OVO ensembles with resampling of the binary classes done with random over sampling (ROS) or random under sampling (RUS) following recommendations of (Fernandez et al, 2013) and (Galar et al, 2011) and NCR as a more informative under sampling (Laurikkala, 2001), and newly introduced Multi-class Roughly Balanced Bagging, which showed good experimental results in the work of Lango and Stefanowski (2018).…”
Section: Methods (mentioning)
Confidence: 99%
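The one-vs-one (OVO) decomposition with random over-sampling (ROS) of the binary classes, as described in the quoted passage, can be sketched like this. The helper name and return structure are illustrative assumptions, not an API from the cited works.

```python
from itertools import combinations

import numpy as np

def ovo_with_ros(X, y, rng=None):
    # Decompose a multi-class problem into one binary task per pair
    # of classes, randomly over-sampling the smaller class of each
    # pair (ROS) so both classes reach the same size.
    rng = np.random.default_rng() if rng is None else rng
    tasks = {}
    for a, b in combinations(np.unique(y), 2):
        idx_a = np.flatnonzero(y == a)
        idx_b = np.flatnonzero(y == b)
        small, large = sorted([idx_a, idx_b], key=len)
        # duplicate random examples of the smaller class
        extra = rng.choice(small, size=len(large) - len(small), replace=True)
        idx = np.concatenate([idx_a, idx_b, extra])
        tasks[(a, b)] = (X[idx], y[idx])
    return tasks
```

With k classes this produces k(k-1)/2 balanced binary tasks, which is the quadratic cost the Results quotation above refers to.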
“…Recently, yet another extension of Roughly Balanced Bagging to multiclass imbalanced data has been proposed by Lango and Stefanowski (2018).…”
Section: Related Work On Multiclass Imbalances (mentioning)
Confidence: 99%
“…Most-popular data-level approaches include extensions of the SMOTE algorithm into a multi-class setting [22,23,24], strategies using feature selection [25,26], and alternative methods for instance generation by using Mahalanobis distance [27,28]. Algorithm-level solutions include decision tree adaptations [29], cost-sensitive matrix learning [30], and ensemble solutions utilizing Bagging [31,32] and Boosting [5,33].…”
Section: Multi-class Imbalanced Problems (mentioning)
Confidence: 99%
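As a concrete instance of the cost-sensitive, algorithm-level direction mentioned in the quotation above, a common baseline is to weight each class inversely to its frequency. The sketch below is illustrative and not taken from the cited works; it mirrors the widely used n_samples / (n_classes * class_count) heuristic.

```python
import numpy as np

def inverse_frequency_weights(y):
    # Per-class weights inversely proportional to class frequency:
    # rare classes get large weights, frequent classes small ones,
    # so misclassifying a minority example costs more.
    classes, counts = np.unique(y, return_counts=True)
    weights = len(y) / (len(classes) * counts)
    return dict(zip(classes.tolist(), weights.tolist()))
```

Such weights can be passed to any learner that accepts per-class or per-sample costs, making this a drop-in alternative to resampling.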