2022
DOI: 10.1109/access.2022.3209239
Unsupervised Cross-Domain Person Re-Identification Method Based on Attention Block and Refined Clustering

Abstract: Most clustering-based unsupervised cross-domain person re-identification methods suffer from weakly discriminative features and from the pseudo-label noise that clustering generates, both of which reduce accuracy. To address these problems, this paper proposes an unsupervised cross-domain person re-identification method based on an attention block and refined clustering. First, ResNet50 is selected as the backbone network; coordinate attention and triple attention are concatenated and embedded in ResNet50 to extract f…
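The abstract mentions embedding coordinate attention in the ResNet50 backbone. A rough, self-contained sketch of what a coordinate-attention block computes is shown below; the reduction ratio, weight shapes, and random weights are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def coordinate_attention(x, reduction=8, rng=None):
    """Simplified coordinate attention on a (C, H, W) feature map.

    Pools along each spatial axis separately, mixes channels through a
    shared bottleneck, then re-weights the input with direction-aware
    sigmoid gates. Weights are random here purely for illustration.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    C, H, W = x.shape
    r = max(C // reduction, 1)

    # Direction-aware pooling: average over W gives (C, H); over H gives (C, W).
    pool_h = x.mean(axis=2)
    pool_w = x.mean(axis=1)

    # Shared channel-mixing bottleneck (stand-in for a 1x1 conv) on the
    # concatenated pooled descriptors, followed by ReLU.
    w1 = rng.standard_normal((r, C)) * 0.1
    y = np.concatenate([pool_h, pool_w], axis=1)   # (C, H + W)
    y = np.maximum(w1 @ y, 0.0)                    # (r, H + W)

    # Split back per direction and project to per-channel sigmoid gates.
    y_h, y_w = y[:, :H], y[:, H:]
    w_h = rng.standard_normal((C, r)) * 0.1
    w_w = rng.standard_normal((C, r)) * 0.1
    a_h = sigmoid(w_h @ y_h)   # (C, H) gate along height
    a_w = sigmoid(w_w @ y_w)   # (C, W) gate along width

    # Broadcast both gates over the feature map; output keeps the input shape.
    return x * a_h[:, :, None] * a_w[:, None, :]

feat = np.random.default_rng(1).standard_normal((16, 8, 8))
out = coordinate_attention(feat)
print(out.shape)  # (16, 8, 8)
```

In a real network this block would sit inside a residual stage, with learned convolution weights in place of the random matrices; the gating pattern, however, is the same.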

Cited by 2 publications (1 citation statement) | References 32 publications
“…Numerous influential works have contributed to the progress in this field. An unsupervised cross-domain person re-identification method based on attention blocks and refined clustering was proposed in [25] to address the issues of feature discrimination and pseudo-label noise. Another work, "Eliminating Background-Bias," tackled background interference through foreground-aware feature learning [26].…”
Section: Unsupervised Person Re-identification
confidence: 99%