2020
DOI: 10.48550/arxiv.2010.09084
Preprint

Gait Recognition using Multi-Scale Partial Representation Transformation with Capsules

Abstract: Gait recognition, referring to the identification of individuals based on the manner in which they walk, can be very challenging due to the variations in the viewpoint of the camera and the appearance of individuals. Current methods for gait recognition have been dominated by deep learning models, notably those based on partial feature representations. In this context, we propose a novel deep network, learning to transfer multi-scale partial gait representations using capsules to obtain more discriminative gai…
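As a point of reference for the partial-representation idea mentioned in the abstract, below is a minimal, illustrative sketch of multi-scale horizontal partitioning of a convolutional feature map into partial descriptors, written in PyTorch. The module name, the choice of scales, and the pooling scheme are assumptions for illustration only; the capsule-based transformation described in the paper is not reproduced here.

```python
# Illustrative sketch only: multi-scale horizontal partitioning of a
# convolutional feature map into partial representations, as commonly used
# in partial-feature gait models. Names, scales, and dimensions are
# assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

class MultiScalePartialPooling(nn.Module):
    def __init__(self, scales=(1, 2, 4, 8)):
        super().__init__()
        self.scales = scales  # number of horizontal strips per scale

    def forward(self, feat):
        # feat: (batch, channels, height, width) feature map from a CNN backbone
        b, c, h, w = feat.shape
        parts = []
        for s in self.scales:
            # split the feature map into s horizontal strips and pool each strip
            strips = feat.view(b, c, s, h // s, w)
            pooled = strips.mean(dim=(3, 4)) + strips.amax(dim=(3, 4))  # (b, c, s)
            parts.append(pooled)
        # concatenate partial descriptors from all scales: (b, c, sum(scales))
        return torch.cat(parts, dim=2)
```

The resulting per-strip descriptors would then be fed to a downstream module (in the paper's case, capsules) to obtain the final gait features.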

Cited by 2 publications (2 citation statements)
References 25 publications
“…GLN [22] adopts the Triplet loss and the Cross-Entropy loss in different training stages, while CapsNet [14] applies these two loss functions to different components of its network.…”
Section: Related Work, A. Gait Recognition (mentioning)
confidence: 99%
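The combination of a triplet loss on embedding outputs with a cross-entropy loss on classification outputs, as the statement above describes for GLN [22] and CapsNet [14], can be sketched as follows in PyTorch. The function signature, the margin, and the weighting factor are illustrative assumptions, not the configurations used in those papers.

```python
# Minimal sketch of combining a triplet loss and a cross-entropy loss applied
# to different outputs of a network. The model interface (embeddings, logits)
# and the weighting factor are assumptions for illustration.
import torch
import torch.nn as nn

triplet_loss = nn.TripletMarginLoss(margin=0.2)
ce_loss = nn.CrossEntropyLoss()

def combined_loss(anchor_emb, pos_emb, neg_emb, logits, labels, ce_weight=1.0):
    # triplet loss on the embedding output (metric-learning component)
    l_tri = triplet_loss(anchor_emb, pos_emb, neg_emb)
    # cross-entropy loss on the classification output (identity logits)
    l_ce = ce_loss(logits, labels)
    return l_tri + ce_weight * l_ce
```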
“…However, when this method is used to represent the silhouettes in a gait sequence, it often loses important detailed information and makes recognition more difficult. Other model-based methods encounter the same problem, so appearance-based methods have become the mainstream approach to gait recognition [6], [31], [32], [7], [33], [34], [35]. The methods we discuss later in this paper belong to the appearance-based category.…”
Section: Related Work (mentioning)
confidence: 99%