2021 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv48922.2021.00136
Towards Learning Spatially Discriminative Feature Representations

Cited by 20 publications (11 citation statements) | References 26 publications
“…The current method assumes a simple fracture presence, and we plan to generalize it to deal with multiple, complicated, and compound fracture presences. In addition, it will be interesting to see how human annotation can be further saved in light of [51], [52]. (c) demonstrates the network attention difference between the ResNets trained with attention guidance or without (i.e., vanilla CAM).…”
Section: Discussion
confidence: 99%
“…CCKD [31] proposed to transfer the correlation between input instances. CCM [38] proposed to match the class activation map of the teacher with the class-agnostic activation map of the student. A survey of knowledge distillation [11] discussed the different forms of knowledge in three categories: response-based [14], [26], feature-based [1], [32], [38], [43], and relation-based [31], [34], [36], [41]. However, the transferred knowledge in existing methods always comes from pretrained teacher networks, ignoring the knowledge generated during the teacher's training process, which we call the teacher's experience.…”
Section: Related Work, A. Knowledge Distillation
confidence: 99%
“…Many works utilized Grad-CAM not only as a powerful tool for offline model analysis but as an embedded component in the designed deep learning model for various applications. For example, one notable capability of CAM is the target localization of a model trained only with image labels; therefore, it prevails in weakly supervised tasks, such as segmentation [4,13,16,43] and detection [38,45,50], or even knowledge distillation [39].…”
Section: Grad-based Class Activation Map
confidence: 99%
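The Grad-CAM computation referenced in this section reduces to a few array operations once the conv feature maps and the gradients of the class score with respect to them are in hand. A minimal NumPy sketch, assuming those two arrays are already extracted (in a real framework they would come from forward/backward hooks; the random inputs below are stand-ins):

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Grad-CAM heatmap from last-conv features and class-score gradients.
    feature_maps, gradients: (C, H, W) arrays for one input image."""
    # Channel importance alpha_k: global-average-pool the gradients.
    alphas = gradients.mean(axis=(1, 2))                       # (C,)
    # Weighted combination of feature channels.
    cam = np.tensordot(alphas, feature_maps, axes=([0], [0]))  # (H, W)
    cam = np.maximum(cam, 0.0)  # ReLU: keep only positive evidence
    if cam.max() > 0:
        cam /= cam.max()        # scale to [0, 1] for visualization
    return cam

# Stand-in activations and gradients (16 channels on a 7x7 grid).
rng = np.random.default_rng(1)
feats = rng.standard_normal((16, 7, 7))
grads = rng.standard_normal((16, 7, 7))
heatmap = grad_cam(feats, grads)
```

The resulting low-resolution heatmap is typically upsampled to the input size and overlaid on the image; that it needs only image-level labels at training time is what makes it useful for the weakly supervised localization tasks cited above.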