2017
DOI: 10.1016/j.patcog.2017.07.024

Synthetic minority oversampling technique for multiclass imbalance problems

Cited by 212 publications (71 citation statements). References 17 publications.
“…To show the effectiveness of our sampling strategy, we compare the performance of our algorithms (including AL-OR and IAL-IOR) with random sampling (randomly selecting query samples). To show the generalization of our algorithms, we compare their performance with state-of-the-art imbalance methods (including under-sampling (US) and an over-sampling method (SMOTE [4])) and recently proposed imbalance methods (including SMOR [8] and SMOM [41]). The performance of all algorithms will be […] The code of SMOR [8] and SMOM [41] is available at https://github.com/zhutuanfei/SMOR, and the code of our algorithms is available at https://github.com/gjmrookie/active-learning-forimbalanced-OR.…”
Section: A. Experimental Setup, 1) Design of Experiments
confidence: 99%
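
To make the baselines named in this excerpt concrete, here is a minimal sketch of the US and SMOTE rebalancing step using the imbalanced-learn package. The toy dataset, classifier, and class weights are placeholder assumptions for illustration, not the cited paper's experimental protocol.

    # Minimal sketch, assuming scikit-learn and imbalanced-learn are installed.
    # An imblearn Pipeline resamples only the training folds, avoiding leakage.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from imblearn.over_sampling import SMOTE
    from imblearn.under_sampling import RandomUnderSampler
    from imblearn.pipeline import Pipeline

    # Imbalanced three-class toy problem (placeholder for the real datasets).
    X, y = make_classification(n_samples=2000, n_classes=3, n_informative=6,
                               weights=[0.80, 0.15, 0.05], random_state=0)

    for name, sampler in [("US", RandomUnderSampler(random_state=0)),
                          ("SMOTE", SMOTE(random_state=0))]:
        pipe = Pipeline([("sampler", sampler),
                         ("clf", RandomForestClassifier(random_state=0))])
        score = cross_val_score(pipe, X, y, cv=5).mean()
        print(f"{name}: mean 5-fold CV accuracy = {score:.3f}")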
“…Other popular over-sampling approaches include Borderline-SMOTE (BLSMOTE) by Han et al, 17 Safe-Level-SMOTE (SLSMOTE) by Bunkhumpornpat et al, 18 RACOG (rapidly converging Gibbs algorithm) by Das et al, 16 MDO (Mahalanobis distance-based over-sampling algorithm) by Abdi and Hashemi, 19 A-SUWO (adaptive semi-unsupervised weighted over-sampling algorithm) by Nekooeimehr and Lai-Yuen, 20 and SMOM (k-nearest-neighbours-based synthetic minority over-sampling algorithm) by Zhu et al, 21 among others. Because of the limited space of this paper, it is impossible to cover all the data sampling approaches in the literature; interested readers may refer to References [6,22,23] for more details.…”
Section: Related Work
confidence: 99%
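
Of the methods listed in this excerpt, Borderline-SMOTE is the most readily available off the shelf; the fragment below is a minimal usage sketch, assuming imbalanced-learn (whose implementation is independent of the cited papers) and a synthetic toy dataset.

    # Borderline-SMOTE synthesizes samples only near the class boundary,
    # unlike plain SMOTE, which treats all minority points alike.
    from collections import Counter
    from sklearn.datasets import make_classification
    from imblearn.over_sampling import BorderlineSMOTE

    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=1)
    print("before:", Counter(y))          # roughly 900 vs. 100
    X_res, y_res = BorderlineSMOTE(random_state=1).fit_resample(X, y)
    print("after: ", Counter(y_res))      # classes balanced by synthesis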
“…Douzas et al [47] proposed k-means-SMOTE by combining k-means clustering with SMOTE, which avoids generating noise and effectively overcomes imbalance both between and within classes. Tuanfei Zhu et al. successively proposed synthetic minority oversampling for multiclass imbalance (SMOM) [48] and synthetic minority oversampling for imbalanced ordinal regression (SMOR) [49]. SMOM is a k-NN-based synthetic minority oversampling algorithm that assigns a selection weight to each neighbour direction.…”
Section: Approaches for Imbalanced Data Classification
confidence: 99%
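
To illustrate the "selection weight per neighbour direction" idea attributed to SMOM above, here is a deliberately simplified sketch: each synthetic sample interpolates from a minority point towards one of its k nearest minority neighbours, and directions whose midpoint lands in non-minority territory are down-weighted. The weighting rule is a hypothetical stand-in for illustration only, not the actual SMOM formula from [48].

    # Simplified SMOM-style oversampling sketch (NOT the exact algorithm of [48]):
    # neighbour directions whose interpolation path stays in minority territory
    # receive larger selection weights.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def smom_like_oversample(X_min, X_other, n_new, k=5, rng=None):
        rng = np.random.default_rng(rng)
        # k nearest minority neighbours of each minority point (column 0 = self)
        nn_min = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
        _, idx = nn_min.kneighbors(X_min)
        # index over all points, minority stacked first, to probe midpoints
        nn_all = NearestNeighbors(n_neighbors=1).fit(np.vstack([X_min, X_other]))
        n_min = len(X_min)
        samples = []
        for _ in range(n_new):
            i = rng.integers(n_min)
            neighbours = idx[i, 1:]
            mids = (X_min[i] + X_min[neighbours]) / 2.0  # midpoint per direction
            _, owner = nn_all.kneighbors(mids)
            # assumed weighting: a direction is "safe" if the midpoint's nearest
            # training point is a minority instance; unsafe directions keep a
            # small residual weight so no direction is fully excluded
            w = np.where(owner.ravel() < n_min, 1.0, 0.1)
            w = w / w.sum()
            j = rng.choice(neighbours, p=w)
            gap = rng.random()
            samples.append(X_min[i] + gap * (X_min[j] - X_min[i]))
        return np.asarray(samples)

Usage, assuming class c is the minority: X_syn = smom_like_oversample(X[y == c], X[y != c], n_new=100, rng=0).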