2021
DOI: 10.1109/tmi.2021.3098703
Efficient Medical Image Segmentation Based on Knowledge Distillation

Cited by 105 publications (36 citation statements)
References 38 publications
“…The music signal feature extraction here is based on the steady-state signal. Therefore, before feature extraction it is generally necessary to divide the signal into frames ( Qin et al, 2021 ). To ensure that no information is lost in the seamless transition between two signals, the two signals must have overlapping parts.…”
Section: Methodsmentioning
confidence: 99%
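The overlapping frame division described in the statement above can be sketched as follows; the frame and hop lengths are illustrative assumptions (typical 25 ms / 10 ms speech-processing values), not parameters from the cited work:

```python
import numpy as np

def frame_signal(signal, frame_len, hop_len):
    """Split a 1-D signal into overlapping frames.

    Consecutive frames share frame_len - hop_len samples, so the
    transition between adjacent frames loses no information.
    """
    n_frames = 1 + (len(signal) - frame_len) // hop_len
    return np.stack([signal[i * hop_len : i * hop_len + frame_len]
                     for i in range(n_frames)])

# A 1-second signal at 16 kHz: 25 ms frames (400 samples) with a
# 10 ms hop (160 samples), i.e. 15 ms of overlap between frames.
x = np.arange(16000, dtype=float)
frames = frame_signal(x, frame_len=400, hop_len=160)
print(frames.shape)  # (98, 400)
```

Each frame then serves as a quasi-stationary segment from which steady-state features can be extracted.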
“…Wen et al [20] propose a boundary-guided knowledge distillation method that helps the student network align organ boundary features with those of the teacher network. Qin et al [21] design a region affinity distillation method and integrate an importance map for efficient liver tumor segmentation. However, these methods mostly concentrate on distilling knowledge from certain layers, and do not account for the variations of pathological or organic semantic-aware representations across different layers.…”
Section: B Knowledge Distillationmentioning
confidence: 99%
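The region affinity distillation idea referenced above can be illustrated with a minimal sketch. This is not the exact formulation of Qin et al [21]; the cosine-affinity construction, feature shapes, and function names here are assumptions for illustration only:

```python
import numpy as np

def affinity_matrix(feat):
    """Pairwise cosine affinity between spatial positions.

    feat: (C, H, W) feature map -> (H*W, H*W) affinity matrix.
    The matrix size is independent of the channel count, so student
    and teacher features of different widths remain comparable.
    """
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + 1e-8)
    return f.T @ f

def affinity_distill_loss(student_feat, teacher_feat):
    """Mean squared difference between student and teacher affinities."""
    a_s = affinity_matrix(student_feat)
    a_t = affinity_matrix(teacher_feat)
    return np.mean((a_s - a_t) ** 2)

rng = np.random.default_rng(0)
s = rng.standard_normal((16, 8, 8))   # hypothetical student feature map
t = rng.standard_normal((64, 8, 8))   # teacher with more channels
loss = affinity_distill_loss(s, t)
```

Because the affinity matrix compares spatial positions rather than raw channels, the student can mimic the teacher's region-to-region structure without matching its channel width.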
“…Wen et al [20] propose a boundary-guided distillation method for organ segmentation. Qin et al [21] design a region affinity distillation method for efficient liver tumor segmentation. Tran et al [22] propose a new Light-weight Deformable Registration network that introduces adversarial learning with knowledge distillation.…”
Section: Introductionmentioning
confidence: 99%
“…Self-distillation explores the potential of knowledge distillation from a new perspective [31]. Similarly, self-distillation strategies can be summarized as extracting the attention maps of the current layers and then transferring that knowledge to the previous layers [32].…”
Section: Self-distillation Learningmentioning
confidence: 99%
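The attention-transfer style of self-distillation quoted above can be sketched as follows. The squared-activation attention map and the assumption that both layers share a spatial size are illustrative choices, not the formulation of [32]; in practice the shallower map would usually be resized to match:

```python
import numpy as np

def attention_map(feat):
    """Spatial attention map: channel-wise mean of squared activations,
    normalized to sum to 1 over spatial positions.

    feat: (C, H, W) -> (H, W)
    """
    a = (feat ** 2).mean(axis=0)
    return a / (a.sum() + 1e-8)

def self_distill_loss(shallow_feat, deep_feat):
    """Align a shallower layer's attention map with a deeper layer's.

    Both feature maps come from the SAME network: the deeper layer
    acts as the teacher for the earlier (shallower) layer.
    """
    return np.mean((attention_map(shallow_feat)
                    - attention_map(deep_feat)) ** 2)

rng = np.random.default_rng(1)
shallow = rng.standard_normal((8, 4, 4))   # hypothetical early-layer features
deep = rng.standard_normal((32, 4, 4))     # hypothetical deeper-layer features
loss = self_distill_loss(shallow, deep)
```

This loss is added to the ordinary task loss during training, so the network distills its own deeper representations into its earlier layers without a separate teacher model.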