2024
DOI: 10.21203/rs.3.rs-3920683/v1

Learn from Restoration: Exploiting Oriented Knowledge Distillation in Self-Supervised Learning for Person Re-Identification

Enze Yang,
Yuxin Liu,
Shitao Zhao
et al.

Abstract: Person Re-IDentification (person ReID) aims to identify individuals across diverse surveillance scenarios and plays a pivotal role in centralized monitoring. However, recent studies pre-train their models on ImageNet and fine-tune them on a specific downstream ReID dataset, which restricts generalization on person-centered tasks. Addressing this limitation, this paper introduces a Self-Supervised Learning (SSL) model with Oriented Token Prior guided Knowledge Distillation. The prop…
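The truncated abstract names knowledge distillation inside an SSL framework, but does not spell out the paper's oriented-token-prior mechanism. As background only, here is a minimal sketch of the soft-target distillation loss that such methods typically build on: the student is trained to match the teacher's temperature-softened output distribution via a KL divergence, scaled by T². This is the generic formulation, not the authors' specific objective.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits (numerically stable)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard soft-target knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss;
# a mismatched student incurs a positive loss.
loss_same = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
loss_diff = distillation_loss([2.0, 0.5, -1.0], [-1.0, 0.5, 2.0])
```

In a full ReID pipeline this term would be combined with the self-supervised reconstruction objective; the function names above are illustrative, not from the paper.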

Cited by 0 publications
References 39 publications