2021
DOI: 10.1016/j.patrec.2021.07.017

Imbalanced image classification with complement cross entropy

Cited by 58 publications (15 citation statements) · References 14 publications
“…The loss function is the central component of training a neural network and is used to adjust the network's weights [60]. When the network processes training examples, it produces outputs that indicate the probability, or confidence, of the possible categories to which the analyzed data belong [61]. The resulting probabilities are compared with the true labels.…”
Section: Modification of the Cross-Entropy Loss Function Using Weight… (mentioning)
confidence: 99%
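
This statement describes the mechanism the cited section modifies: network outputs are converted into class probabilities and compared against the true labels, with per-class weights applied as the modification. A minimal PyTorch sketch of weighted cross-entropy along those lines (the function name, the mean reduction, and the inverse-frequency weighting in the usage example are illustrative assumptions, not taken from the cited work):

```python
import torch
import torch.nn.functional as F

def weighted_cross_entropy(logits, targets, class_weights=None):
    """Cross-entropy between predicted class probabilities and true labels.

    logits: (N, K) raw network outputs; targets: (N,) integer class labels.
    class_weights: optional (K,) tensor, e.g. inverse class frequencies,
    to upweight minority classes (an assumed weighting scheme).
    """
    log_probs = F.log_softmax(logits, dim=1)               # outputs -> log-probabilities
    nll = -log_probs[torch.arange(len(targets)), targets]  # true-class log-probability
    if class_weights is not None:
        nll = nll * class_weights[targets]                 # weight each sample by its class
    return nll.mean()

# usage: 4 samples, 3 classes, minority class 2 upweighted
logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 2])
loss = weighted_cross_entropy(logits, targets, torch.tensor([1.0, 1.0, 3.0]))
```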
“…For the classification sub-networks (g cls and h cls) and localization sub-networks (g loc and h loc), a fully convolutional network is employed, consisting of four 3 × 3 convolutional layers with 256 channels, same padding, and PReLU [52] activation. Each sub-network is trained with the CCE loss [56] for classification and the smooth L1 loss [2] for 4-axis box coordinate regression. The experimental results are presented in the following sections.…”
Section: A. Implementation Details (mentioning)
confidence: 99%
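
The quoted architecture is concrete enough to sketch. Below is a rough PyTorch rendering of the two sub-networks (four 3 × 3 convolutions with 256 channels, same padding, and PReLU); the output-head shapes, num_classes, and num_anchors are assumptions for illustration, and the CCE [56] and smooth L1 [2] losses would be applied to the two outputs respectively:

```python
import torch.nn as nn

def make_subnet(channels=256, num_layers=4):
    """Fully convolutional sub-network: four 3x3 convs, same padding, PReLU."""
    layers = []
    for _ in range(num_layers):
        layers.append(nn.Conv2d(channels, channels, kernel_size=3, padding=1))
        layers.append(nn.PReLU())
    return nn.Sequential(*layers)

class DetectionHead(nn.Module):
    """Classification (g_cls/h_cls-style) and localization (g_loc/h_loc-style)
    heads; num_classes and num_anchors are illustrative assumptions."""
    def __init__(self, num_classes=20, num_anchors=1):
        super().__init__()
        self.cls_subnet = make_subnet()
        self.loc_subnet = make_subnet()
        # final projections to class scores and 4-axis box offsets
        self.cls_out = nn.Conv2d(256, num_anchors * num_classes, 3, padding=1)
        self.loc_out = nn.Conv2d(256, num_anchors * 4, 3, padding=1)

    def forward(self, feat):                              # feat: (N, 256, H, W)
        cls_logits = self.cls_out(self.cls_subnet(feat))  # for the CCE classification loss
        box_deltas = self.loc_out(self.loc_subnet(feat))  # for the smooth L1 regression loss
        return cls_logits, box_deltas
```

For the regression branch, PyTorch's built-in nn.SmoothL1Loss covers the smooth L1 term; a sketch of the CCE classification loss follows the next statement.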
“…In 2019, Cui et al. [18] proposed the class-balanced loss framework, which uses the effective number of samples per class to inversely weight the loss; this method effectively mitigates the class-imbalance problem. Kim et al. [19] proposed complement cross-entropy (CCE) for imbalanced classification and showed that suppressing the predicted probabilities of the incorrect classes helps a deep model learn discriminative information. By neutralizing the confidence assigned to incorrect classes, minority-class samples receive more learning opportunities.…”
Section: Problem Statement and Background (mentioning)
confidence: 99%
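
The CCE idea summarized here has a compact form: standard cross-entropy plus γ times a complement entropy computed over the incorrect classes, whose probabilities are renormalized by 1 - p_true. A sketch assuming the balanced variant with the commonly used γ = -1; treat the 1/(K-1) normalization and the eps stabilizer here as implementation assumptions rather than the paper's exact code:

```python
import torch
import torch.nn.functional as F

def complement_cross_entropy(logits, targets, gamma=-1.0, eps=1e-8):
    """Complement cross-entropy (CCE), as summarized in the quote above."""
    n, k = logits.shape
    probs = F.softmax(logits, dim=1)
    p_true = probs[torch.arange(n), targets]            # probability of the true class
    p_comp = probs / (1.0 - p_true.unsqueeze(1) + eps)  # renormalize: p_j / (1 - p_true)
    mask = torch.ones_like(probs)
    mask[torch.arange(n), targets] = 0.0                # entropy over incorrect classes only
    complement = -(mask * p_comp * torch.log(p_comp + eps)).sum(dim=1).mean()
    return F.cross_entropy(logits, targets) + gamma * complement / (k - 1)
```

With gamma = -1, minimizing the total loss maximizes the complement entropy, i.e., it pushes the incorrect-class probabilities toward a flat distribution, which is the suppression effect the quote describes.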