One-class ensemble classifier for data imbalance problems
2021
DOI: 10.1007/s10489-021-02671-1


Cited by 32 publications (8 citation statements)
References 35 publications
“…Mohammad et al [29] proposed a clustering-based under-sampling method in which C4.5 with bagging and boosting performs the primary classification of the pre-processed data, the K-Means algorithm clusters the data, and the Mahalanobis distance is introduced to preserve the data distribution pattern within each cluster. Moreover, Hayashi et al [30] proposed a one-class ensemble classifier in which models are trained separately on the majority and minority classes, and the final classification result combines the knowledge learned from both classes.…”
Section: Methods of Imbalance Classification (mentioning)
confidence: 99%
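
As a rough sketch of this per-class training-and-fusion scheme, the snippet below fits one one-class model per class and assigns a test sample to the class whose model scores it highest. The use of scikit-learn's OneClassSVM, the highest-score fusion rule, and the class name OneClassEnsemble are illustrative assumptions, not the exact design of [30].

# Minimal sketch of a per-class one-class ensemble (assumed design).
import numpy as np
from sklearn.svm import OneClassSVM

class OneClassEnsemble:
    def __init__(self, nu=0.1, gamma="scale"):
        self.nu = nu
        self.gamma = gamma
        self.models_ = {}

    def fit(self, X, y):
        # Each model sees only its own class, so the majority/minority
        # ratio never enters the training procedure.
        for label in np.unique(y):
            model = OneClassSVM(nu=self.nu, gamma=self.gamma)
            model.fit(X[y == label])
            self.models_[label] = model
        return self

    def predict(self, X):
        labels = list(self.models_)
        # decision_function: larger values mean "more typical" of that class.
        scores = np.column_stack(
            [self.models_[label].decision_function(X) for label in labels]
        )
        return np.asarray(labels)[scores.argmax(axis=1)]

Because every model is fitted on a single class, the majority-to-minority ratio never enters the optimization, which is the property the later citation statements point to.
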
“…Then, testing samples are predicted by combining all classifiers in the manner of ensemble learning [20]. This kind of strategy is called a one-class ensemble [21][22][23], a promising line of research for detecting unseen samples. Moreover, the one-class ensemble is effective for the data imbalance problem.…”
Section: One-class Classification (mentioning)
confidence: 99%
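
A minimal sketch of the rejection idea follows, assuming each per-class model exposes a decision_function and that a score below a fixed threshold means the sample resembles none of the training classes; the threshold of 0.0 and the unseen_label sentinel of -1 are illustrative assumptions rather than a rule taken from [21][22][23].

import numpy as np

def ensemble_predict_with_rejection(models, X, threshold=0.0, unseen_label=-1):
    """models: dict mapping class label -> fitted one-class model."""
    labels = list(models)
    # Stack one score column per class; larger scores mean a better fit.
    scores = np.column_stack(
        [models[label].decision_function(X) for label in labels]
    )
    predictions = np.asarray(labels)[scores.argmax(axis=1)]
    # If no model accepts the sample, report it as unseen/novel.
    predictions[scores.max(axis=1) < threshold] = unseen_label
    return predictions
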
“…Moreover, the one-class ensemble is effective for the data imbalance problem. Since each model is trained on a single class, class imbalance is not an issue [23]. Such an approach is applicable in many situations.…”
Section: One-class Classification (mentioning)
confidence: 99%
“…For example, classification models often achieve high accuracy but a low recall rate. This happens when the trained model does not account for data set imbalance carefully [37] and its ability to manage imbalanced data is insufficient. To compensate for this shortcoming, criteria that demonstrate the model's efficiency from various perspectives are required.…”
Section: Introduction (mentioning)
confidence: 99%
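
A small illustration of this point with hypothetical numbers: a degenerate classifier that always predicts the majority class reaches high accuracy, while imbalance-aware criteria such as recall, F1, and balanced accuracy expose the failure on the minority class.

import numpy as np
from sklearn.metrics import (accuracy_score, balanced_accuracy_score,
                             f1_score, recall_score)

# Hypothetical 95:5 imbalanced test set; class 1 is the minority.
y_true = np.array([0] * 95 + [1] * 5)
y_pred = np.zeros_like(y_true)  # degenerate "always predict majority" model

print("accuracy:          ", accuracy_score(y_true, y_pred))                 # 0.95
print("recall (class 1):  ", recall_score(y_true, y_pred, zero_division=0))  # 0.00
print("F1 (class 1):      ", f1_score(y_true, y_pred, zero_division=0))      # 0.00
print("balanced accuracy: ", balanced_accuracy_score(y_true, y_pred))        # 0.50
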