Pull & Push: Leveraging Differential Knowledge Distillation for Efficient Unsupervised Anomaly Detection and Localization
2023 | DOI: 10.1109/tcsvt.2022.3218587

Cited by 19 publications (2 citation statements) | References 39 publications
“…Style distillation [32] aims to extract information on multiple styles from the teacher network's feature map, from which the student network can learn to improve its generalizability and reconstruct anomalous patterns. The pull and push [33] method addresses generalizability problems by maximizing pixel-wise discrepancies for anomalous regions and simultaneously minimizing discrepancies for normal regions between two networks.…”
Section: B. Knowledge Distillation-Based Anomaly Detection (mentioning)
confidence: 99%
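To make the pull-push idea in the statement above concrete, here is a minimal PyTorch-style sketch of such an objective, assuming teacher/student feature maps and a binary anomaly mask are available. The function name, the cosine discrepancy, and the hinge margin are illustrative assumptions for this sketch, not the exact formulation of [33].

```python
import torch
import torch.nn.functional as F

def pull_push_loss(teacher_feat, student_feat, anomaly_mask, margin=1.0):
    """Illustrative pull-push distillation objective (not the paper's exact loss).

    teacher_feat, student_feat: (B, C, H, W) feature maps from the two networks.
    anomaly_mask: (B, 1, H, W) binary mask, 1 at (synthetic) anomalous pixels.
    """
    # Per-pixel discrepancy between the two networks; cosine distance is an
    # assumption here, the method could equally use an L2 distance.
    d = 1.0 - F.cosine_similarity(teacher_feat, student_feat, dim=1, eps=1e-6)
    d = d.unsqueeze(1)  # (B, 1, H, W)

    eps = 1e-6
    normal = 1.0 - anomaly_mask

    # Pull: minimize discrepancy over normal pixels.
    pull = (d * normal).sum() / (normal.sum() + eps)
    # Push: maximize discrepancy over anomalous pixels, bounded by a hinge
    # margin so that the loss cannot diverge.
    push = (F.relu(margin - d) * anomaly_mask).sum() / (anomaly_mask.sum() + eps)
    return pull + push
```

Minimizing this loss pulls the two networks together on normal regions while pushing their features at least `margin` apart on anomalous regions, which is the discrepancy behavior the citation statement describes.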
“…Momentum-based approaches have considerably enriched the field of self-supervised learning with the teacher-student (TE-ST) formula. Prior works have carefully exploited the importance of TE-ST discrepancy, such as various knowledge distillation techniques in traditional supervised learning [15], [30]-[34] or semi-supervised learning [35], [36]. However, in the self-supervised learning paradigm, this behavior, i.e.…”
Section: Introduction (mentioning)
confidence: 99%
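For reference, the "momentum-based" teacher-student formula mentioned in this statement typically means the teacher is an exponential moving average (EMA) of the student, as in MoCo/BYOL-style methods. The sketch below shows the standard EMA update; the function name and momentum coefficient are illustrative assumptions.

```python
import torch

@torch.no_grad()
def momentum_update(teacher, student, m=0.996):
    """EMA ("momentum") teacher update: teacher trails the student slowly.

    m is the momentum coefficient; values close to 1 give a slowly-moving teacher.
    """
    for pt, ps in zip(teacher.parameters(), student.parameters()):
        pt.mul_(m).add_(ps, alpha=1.0 - m)
```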