2021 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata52589.2021.9671807
HAR: Hardness Aware Reweighting for Imbalanced Datasets

Cited by 6 publications (8 citation statements)
References 29 publications
“…Note that if we can increase the depth of our network and use more classifier branches, the model performance will be better. Moreover, MCCM dir performs better than MCCM conf, which means that our designed sample difficulty discrimination module is better than the one proposed in ELF (Duggal et al, 2020). The Dirichlet distribution combined with information entropy divides the samples into hard and easy ones better than the simple classifier confidence does.…”
Section: Results
confidence: 86%
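The excerpt above contrasts a Dirichlet-plus-entropy hardness criterion with plain classifier confidence. A minimal sketch of such a criterion, assuming an evidential-style setup where per-class evidence parameterizes a Dirichlet and the entropy of its mean flags ambiguous samples as hard (the function name, the `evidence + 1` parameterization, and the 0.5 threshold are illustrative assumptions, not the cited paper's code):

```python
import numpy as np

def split_by_dirichlet_entropy(evidence, threshold=0.5):
    """Flag samples as hard via the entropy of the Dirichlet mean.

    `evidence` is an (N, K) array of non-negative per-class evidence;
    alpha = evidence + 1 parameterizes one Dirichlet per sample.
    Returns a boolean mask: True -> hard sample.
    """
    alpha = evidence + 1.0                          # Dirichlet parameters
    p = alpha / alpha.sum(axis=1, keepdims=True)    # expected class probabilities
    entropy = -(p * np.log(p + 1e-12)).sum(axis=1)  # Shannon entropy of the mean
    max_entropy = np.log(evidence.shape[1])         # normalize to [0, 1]
    return entropy / max_entropy > threshold

# Peaked evidence (easy) vs. flat evidence (maximally ambiguous, hard)
evidence = np.array([[20.0, 0.0, 0.0],
                     [1.0, 1.0, 1.0]])
mask = split_by_dirichlet_entropy(evidence)  # flags only the flat-evidence sample
```

Unlike a raw softmax confidence, the Dirichlet mean reflects the total amount of evidence, so a sample with little evidence for every class lands at high normalized entropy and is routed to the hard branch.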
“…MCCM conf: this variant is designed based on ELF (Duggal et al, 2020), which uses the classifier confidence to divide the samples into easy and hard ones. In particular, during training, if a sample is classified correctly and its classifier confidence is lower than the threshold (0.9 in ELF and in this study), it will not be sent to the hard classifier.…”
Section: Results
confidence: 99%
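The confidence-threshold routing described above can be sketched as follows, assuming the usual early-exit convention from the ELF line of work: a correctly classified, high-confidence sample exits at the easy branch and everything else continues to the hard classifier (the function and array names are illustrative, and the 0.9 threshold is the one quoted in the text, not a verified implementation detail):

```python
import numpy as np

def route_samples(probs, labels, threshold=0.9):
    """Confidence-based easy/hard routing (sketch, not the papers' code).

    `probs` is an (N, K) array of predicted class probabilities and
    `labels` the (N,) ground-truth labels. A sample exits early when it
    is classified correctly AND its top-class confidence reaches the
    threshold; the returned mask is True for samples that should be
    forwarded to the hard classifier.
    """
    preds = probs.argmax(axis=1)
    confidence = probs.max(axis=1)
    exits_early = (preds == labels) & (confidence >= threshold)
    return ~exits_early

probs = np.array([[0.95, 0.05],   # correct and confident -> exits early
                  [0.60, 0.40],   # correct but unsure    -> hard classifier
                  [0.05, 0.95]])  # confident but wrong   -> hard classifier
labels = np.array([0, 0, 0])
mask = route_samples(probs, labels)
```

Requiring both correctness and high confidence keeps confidently wrong samples in the hard branch, which is what makes this a hardness signal rather than a plain confidence filter.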