2020
DOI: 10.48550/arxiv.2009.02189
Preprint

Imbalanced Image Classification with Complement Cross Entropy

Abstract: Recently, deep learning models have achieved great success in computer vision applications, relying on large-scale class-balanced datasets. However, imbalanced class distributions still limit the wide applicability of these models due to degraded performance. To address this problem, we focus on the study of cross entropy, which mostly ignores output scores on wrong classes. In this work, we discover that neutralizing predicted probabilities on incorrect classes helps improve prediction accuracy for imbalanced image classification.
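For context on the mechanism the abstract describes, a minimal sketch of the loss, assuming the formulations generally attributed to Chen et al. (complement entropy) and to this paper (complement cross entropy); the (K-1) normalization and the weight γ are recalled from the paper's presentation and should be checked against the original:

```latex
% Complement entropy (Chen et al.): entropy of the softmax output
% \hat{y} renormalized over the K-1 incorrect classes, with g the
% ground-truth class of sample i and N the batch size:
C(\hat{y}) = -\frac{1}{N} \sum_{i=1}^{N} \sum_{\substack{j=1 \\ j \neq g}}^{K}
    \frac{\hat{y}_{ij}}{1 - \hat{y}_{ig}} \log \frac{\hat{y}_{ij}}{1 - \hat{y}_{ig}}

% Complement cross entropy (this paper): cross entropy H plus a
% (K-1)-balanced, gamma-weighted complement term; with gamma < 0,
% minimizing the loss flattens the distribution over wrong classes:
\mathrm{CCE}(y, \hat{y}) = H(y, \hat{y}) + \frac{\gamma}{K-1} \, C(\hat{y})
```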

Cited by 3 publications (2 citation statements). References 44 publications.
“…In addition, Chen proposed the complement entropy loss [32], which suppresses misclassification by exploiting the complement (incorrect-class) information of samples, neutralizing erroneous predictions and thereby improving prediction confidence. Because minority classes carry richer error information, Kim et al. [33] applied complement entropy to the class-imbalance problem. However, the existing complement entropy loss improves accuracy only by neutralizing error information; when minority classes make up a very small fraction of the data, the available training samples remain insufficient, which limits how effectively the loss can neutralize errors for those classes.…”
Section: Manuscript Clear Copy
Mentioning confidence: 99%
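A hedged PyTorch sketch of the error-neutralization mechanism described above; the function names and the eps stabilizer are illustrative, and the gamma = -1 default follows the usual presentation of [33]:

```python
import torch
import torch.nn.functional as F

def complement_entropy(logits, targets, eps=1e-7):
    """Entropy of the softmax output renormalized over the K-1
    incorrect classes, averaged over the batch: the term whose
    maximization "neutralizes" probability mass on wrong classes."""
    probs = F.softmax(logits, dim=1)                           # (N, K)
    p_true = probs.gather(1, targets.unsqueeze(1))             # (N, 1)
    # Renormalize the residual mass over the incorrect classes.
    p_comp = probs / (1.0 - p_true + eps)                      # (N, K)
    # Zero out the ground-truth entry so only wrong classes count.
    mask = torch.ones_like(probs).scatter_(1, targets.unsqueeze(1), 0.0)
    ent = -(mask * p_comp * torch.log(p_comp + eps)).sum(dim=1)
    return ent.mean()

def complement_cross_entropy(logits, targets, gamma=-1.0):
    """Cross entropy plus a (K-1)-balanced, gamma-weighted complement
    term; gamma < 0 turns maximizing complement entropy into
    minimizing the combined loss."""
    k = logits.size(1)
    return (F.cross_entropy(logits, targets)
            + gamma / (k - 1) * complement_entropy(logits, targets))
```

In training this drops in for plain cross entropy: the complement term pushes softmax mass on incorrect classes toward uniform, which is the "neutralization" the quoted passage refers to.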
“…Here we create two class-imbalanced datasets from CIFAR-10 and CIFAR-100 to verify whether CCT shows better robustness when this uniform prior assumption is violated. The imbalanced dataset is created by randomly sampling a certain number of samples from each class with a "linear" or "exponential" rule (Kim et al., 2020). Specifically, given a dataset with C classes, for class l ∈ {1, 2, ..., C}, we randomly take samples with a proportion l/C for the "linear" rule and with a proportion exp(l/C) for the "exponential" rule.…”
Section: Classification On Class-imbalanced Datasets
Mentioning confidence: 99%
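To make the quoted sampling rules concrete, a sketch under stated assumptions (the function name, seeding, and clipping of proportions to 1 are illustrative). Note that exp(l/C) exceeds 1 for every class, so the exponential rule exactly as quoted would keep all samples; the exponent may have lost a sign or normalization in extraction.

```python
import numpy as np

def imbalanced_indices(labels, rule="linear", seed=0):
    """Subsample a balanced dataset per the quoted rules: class l of C
    classes (1-indexed) keeps a proportion l/C ("linear") or exp(l/C)
    ("exponential") of its samples. Proportions are clipped at 1, so
    the exponential rule as quoted keeps every sample; see the note
    above. Function name, seeding, and clipping are assumptions."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    c = len(classes)
    kept = []
    for l, cls in enumerate(classes, start=1):
        idx = np.flatnonzero(labels == cls)
        p = l / c if rule == "linear" else np.exp(l / c)
        n_keep = int(min(p, 1.0) * len(idx))
        kept.append(rng.choice(idx, size=n_keep, replace=False))
    return np.concatenate(kept)
```

With the linear rule on CIFAR-10 (C = 10), for example, class 1 keeps 10% of its samples while class 10 keeps all of them, giving per-class counts that grow linearly with the class index.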