2020
DOI: 10.1109/jstars.2020.2995703
A Class Imbalance Loss for Imbalanced Object Recognition

Abstract: The class imbalance problem exists widely in vision data. In these imbalanced datasets, the majority classes dominate the loss and influence the gradient. Hence, these datasets have a significantly negative impact on the performance of many state-of-the-art methods. In this article, we propose a class imbalance loss (CI loss) to handle this problem. To distinguish imbalanced datasets in accordance with the extent of imbalance, we also define an imbalance degree that works as a decision index factor in the CI lo…
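The mechanism the abstract describes, majority classes dominating the loss and the gradient, can be illustrated with plain cross-entropy. The sketch below uses inverse-frequency class weights, a common generic mitigation; it is not the paper's CI loss, whose exact form and imbalance-degree index are truncated in the abstract above.

```python
import math

def cross_entropy(probs, labels, weights=None):
    """Mean (optionally class-weighted) cross-entropy.

    probs  : per-sample probability the model assigns to the true class
    labels : integer class ids, parallel to probs
    weights: optional dict class id -> weight; plain CE when None
    """
    total, norm = 0.0, 0.0
    for p, y in zip(probs, labels):
        w = 1.0 if weights is None else weights[y]
        total += -w * math.log(p)
        norm += w
    return total / norm

# Imbalanced toy batch: 9 majority-class samples, 1 minority sample.
labels = [0] * 9 + [1]
probs = [0.9] * 9 + [0.1]  # confident on the majority class only

plain = cross_entropy(probs, labels)

# Inverse-frequency weights (a common heuristic, NOT the paper's CI loss):
counts = {0: 9, 1: 1}
weights = {c: len(labels) / (len(counts) * n) for c, n in counts.items()}
weighted = cross_entropy(probs, labels, weights)

# The minority sample's large error is diluted in the plain loss
# but dominates once the classes are reweighted.
```

With this batch the plain loss stays small because nine easy majority samples swamp the one badly misclassified minority sample; the weighted loss is several times larger, which is the effect class-balancing losses exploit.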

Cited by 36 publications (14 citation statements)
References 41 publications
“…The aim was to compare the performance of BBW with six popular imbalanced data processing methods, using the same datasets and the same underlying model: down sampling [3], oversampling [4], class weights [1], focal loss [9], weighted softmax loss (WSL) [11], and class imbalance loss (CIL) [12]. All three datasets were used: CHB-MIT, BonnEEG and FAHXJU.…”
Section: Experiments 3 – Comparison of BBW with Existing Methods
confidence: 99%
“…Jia et al [11] propose weighted softmax loss, adaptively parameterized by maximum multi-class imbalance ratio. Zhang et al [12] devise class imbalance loss to improve the cross-entropy loss on imbalanced datasets.…”
Section: Related Work
confidence: 99%
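The excerpt above describes a weighted softmax loss parameterized by the maximum multi-class imbalance ratio. As a rough illustration, the sketch below scales each class by the size of the largest class over its own count; this is one plausible reading, not the exact parameterization of Jia et al. [11] nor the CI loss of Zhang et al. [12].

```python
def imbalance_weights(class_counts):
    """Per-class weights from the imbalance ratio against the largest
    class (an illustrative reading of max-ratio weighting; the exact
    scheme in the cited work is not reproduced here)."""
    max_count = max(class_counts.values())
    return {c: max_count / n for c, n in class_counts.items()}

# Hypothetical class counts for illustration.
counts = {"background": 1000, "rare_object": 20}
w = imbalance_weights(counts)
# majority class keeps weight 1.0; the rare class is upweighted 50x
```

Weights like these would then multiply each sample's softmax cross-entropy term, so gradients from the rare class are no longer drowned out by the majority class.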
“…The final output of the proposed method is the mean of two probability distribution vectors. In this paper, the cross entropy loss [50] is used to calculate the error for backward propagation, and the stochastic gradient descent (SGD) [51] is used to optimize parameters. During the test process, parameters of the ASC model and parts images of each test target also need to be calculated at first.…”
Section: E. Training and Test Process
confidence: 99%
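The training recipe quoted above (cross-entropy error driving backward propagation, with SGD updating parameters) can be sketched end to end on a toy linear softmax classifier. Everything below is a minimal self-contained illustration, not the cited paper's model.

```python
import math

def softmax(z):
    m = max(z)  # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def sgd_step(w, x, y, lr=0.1):
    """One SGD update of a linear softmax classifier on one sample.
    w is a list of per-class weight vectors; the gradient of
    cross-entropy w.r.t. the logits is (p - onehot(y))."""
    logits = [sum(wi * xi for wi, xi in zip(wc, x)) for wc in w]
    p = softmax(logits)
    loss = -math.log(p[y])  # cross-entropy for the true class y
    for c in range(len(w)):
        g = p[c] - (1.0 if c == y else 0.0)
        w[c] = [wc - lr * g * xi for wc, xi in zip(w[c], x)]
    return loss

w = [[0.0, 0.0], [0.0, 0.0]]  # 2 classes, 2 features
loss_before = sgd_step(w, x=[1.0, 2.0], y=0)
loss_after = sgd_step(w, x=[1.0, 2.0], y=0)
# the loss on the same sample decreases after one update
```

Repeating the update drives the loss toward zero on this sample; real training averages such gradients over mini-batches and many classes.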