2022
DOI: 10.1007/978-3-031-16437-8_21
Flat-Aware Cross-Stage Distilled Framework for Imbalanced Medical Image Classification

Cited by 7 publications
(1 citation statement)
References 19 publications
“…Ozturk et al. proposed decoupling the learning of features and the classifier: their method uses deep clustering to obtain features with maximum class separation and then learns the classifier while preserving the class margins [42]. Similarly, Li et al. proposed a cross-stage distillation method to prevent the classifier from being biased by the learned features [67]. The attention mechanism was also leveraged to exploit class-agnostic global attention feature maps for imbalanced medical data [68].…”
Section: Class Imbalance
confidence: 99%