2020
DOI: 10.1016/j.patrec.2020.05.020
Adaptive learning of minority class prior to minority oversampling

Cited by 9 publications (3 citation statements)
References 18 publications
“…Whether the Borderline-SMOTE, Safe-level SMOTE, or ADASYN method is used, the absence of a specified sampling region for each minority-class instance can lead to the generation of erroneous or noisy instances. Budapest et al proposed Learning Minority Class prior to Minority Oversampling (LMCMO) [28] and Adaptive Learning of Minority Class prior to Minority Oversampling (ALMCMO) [29], which estimate the minority-class space before generating synthetic minority points, avoiding the production of incorrect instances. SMOTE-ENN combines the SMOTE oversampling technique with the Edited Nearest Neighbors (ENN) undersampling technique [30].…”
Section: Related Work
confidence: 99%
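The interpolation step shared by SMOTE and its variants, which the quote above discusses, can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the cited authors' implementation; `smote_sample`, `k`, and `n_new` are hypothetical names:

```python
import numpy as np

def smote_sample(X_min, k=5, n_new=10, seed=0):
    """Minimal SMOTE-style oversampling sketch: for each synthetic point,
    pick a random minority instance and interpolate toward one of its
    k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]    # k nearest neighbours per point
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))     # random minority instance
        j = rng.choice(nn[i])            # one of its k neighbours
        gap = rng.random()               # interpolation factor in [0, 1)
        synth.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synth)

X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                  [1.0, 1.0], [0.5, 0.5], [0.2, 0.8]])
new_pts = smote_sample(X_min, k=3, n_new=4)
print(new_pts.shape)  # (4, 2)
```

Because every synthetic point lies on a segment between two existing minority points, it can fall in majority territory when no sampling region is constrained, which is exactly the failure mode the quoted passage attributes to Borderline-SMOTE, Safe-level SMOTE, and ADASYN.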
“…Classification accuracy on the majority class will be better than on the minority class [4]. In datasets with class imbalance, the minority class often carries information that is more interesting than that of the majority class [5]. Handling class imbalance can increase overall classification accuracy [6].…”
Section: Introduction
confidence: 99%
“…Algorithm modification improves an existing algorithm or classification paradigm to adapt it to learning the minority class. There are three classic data-preprocessing techniques, namely under-sampling [12], over-sampling [10], [13], [14], [15], and mixed sampling, all of which are common methods. Under-sampling is a reasonable pruning of the majority class, while over-sampling is an effective supplement to the minority-class data.…”
Section: Introduction
confidence: 99%
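The two classic resampling directions named in the quote above can be sketched for a binary task as follows. This is a minimal sketch assuming NumPy arrays and integer labels; `rebalance` is a hypothetical helper, not a function from the cited works:

```python
import numpy as np

def rebalance(X, y, mode="over", seed=0):
    """Sketch of the two classic resampling directions:
    'over'  duplicates random minority rows up to the majority count;
    'under' keeps a random subset of majority rows down to the minority count."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    maj, mino = classes[np.argmax(counts)], classes[np.argmin(counts)]
    idx_maj = np.flatnonzero(y == maj)
    idx_min = np.flatnonzero(y == mino)
    if mode == "over":
        # sample extra minority indices with replacement
        extra = rng.choice(idx_min, size=len(idx_maj) - len(idx_min))
        keep = np.concatenate([idx_maj, idx_min, extra])
    else:
        # drop majority rows at random, without replacement
        sub = rng.choice(idx_maj, size=len(idx_min), replace=False)
        keep = np.concatenate([sub, idx_min])
    return X[keep], y[keep]

X = np.arange(20).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)      # 8 majority, 2 minority
Xo, yo = rebalance(X, y, "over")
print(np.bincount(yo))               # [8 8]
```

Mixed sampling, the third technique the quote lists, would combine both directions, e.g. moderately under-sampling the majority while moderately over-sampling the minority.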