2014
DOI: 10.1007/978-3-319-11179-7_66
Empowering Imbalanced Data in Supervised Learning: A Semi-supervised Learning Approach

Cited by 4 publications (3 citation statements) · References 13 publications
“…The increased likelihood of overfitting is the main drawback of random over-sampling techniques, owing to the replication of minority instances (Almogahed and Kakadiaris, 2014[8]). Chawla et al. (2002[23]) proposed the Synthetic Minority Oversampling Technique (SMOTE), which creates synthetic examples rather than over-sampling with replacement.…”
Section: Cornerstones of a CAD System (confidence: 99%)
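The baseline that the quoted passage contrasts with SMOTE can be sketched as follows. This is a minimal illustration, not the paper's method: minority rows are duplicated verbatim until the classes balance, which is exactly why the overfitting risk rises. The function name and data are illustrative.

```python
import numpy as np

def random_oversample(X, y, minority_label, seed=0):
    """Balance classes by duplicating minority rows with replacement."""
    rng = np.random.default_rng(seed)
    min_idx = np.flatnonzero(y == minority_label)
    maj_count = np.sum(y != minority_label)
    # draw exact copies of minority rows until counts match
    extra = rng.choice(min_idx, size=maj_count - len(min_idx), replace=True)
    X_bal = np.vstack([X, X[extra]])
    y_bal = np.concatenate([y, y[extra]])
    return X_bal, y_bal

X = np.arange(10, dtype=float).reshape(5, 2)
y = np.array([0, 0, 0, 0, 1])  # one minority sample
X_bal, y_bal = random_oversample(X, y, minority_label=1)
# classes are now balanced, but every added row is an exact copy
```

Because every synthetic row is an exact duplicate, a classifier can memorise the few distinct minority points, which is the overfitting drawback cited above.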
“…The minority class is over-sampled by taking each minority class sample and introducing synthetic examples along the line segments joining any/all of the k minority class nearest neighbours. SMOTE is an effective over-sampling technique, but it has deficiencies such as over-generation, because the generation of synthetic samples increases class overlap (Almogahed and Kakadiaris, 2014[8]). Over-generation is problematic in the case of a skewed class distribution with a sparse minority class versus the majority class (Maciejewski and Stefanowski, 2011[90]).…”
Section: Cornerstones of a CAD System (confidence: 99%)
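The interpolation step described in the quoted passage can be sketched in a few lines. This is a simplified illustration of SMOTE-style generation, not the reference implementation: each synthetic point lies on the line segment between a minority sample and one of its k nearest minority-class neighbours. Function name and data are illustrative.

```python
import numpy as np

def smote_sketch(X_min, n_synthetic, k=3, seed=0):
    """Generate synthetic minority points by interpolating between each
    sampled minority point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_min))
        x = X_min[i]
        # Euclidean distances from x to every minority sample
        d = np.linalg.norm(X_min - x, axis=1)
        neighbours = np.argsort(d)[1:k + 1]  # skip the sample itself
        xn = X_min[rng.choice(neighbours)]
        gap = rng.random()                   # random position on the segment
        synthetic.append(x + gap * (xn - x))
    return np.array(synthetic)

X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_new = smote_sketch(X_min, n_synthetic=5)
```

Because every synthetic point is a convex combination of two minority samples, points can land in regions that overlap the majority class, which is the over-generation problem the citing papers note.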
“…However, it has been argued that random under-sampling may lose some relevant information, while randomly over-sampling the smallest class with replacement may lead to overfitting (Almogahed and Kakadiaris 2014). More sophisticated sampling techniques may help avoid these drawbacks.…”
Section: Introduction (confidence: 99%)