2014
DOI: 10.1007/978-3-319-05458-2_61

3D Gait Recognition Using Spatio-Temporal Motion Descriptors

Abstract: We present a view-independent algorithm for 3D human gait recognition. The identification of the person is achieved using motion data obtained by our markerless 3D motion tracking algorithm. We report its tracking accuracy using ground-truth data obtained by a marker-based motion capture system. The classification is done using an SVM built on the proposed spatio-temporal motion descriptors. The identification performance was determined using 230 gait cycles performed by 22 persons. The correctly classif…
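To make the classification stage described in the abstract concrete, here is a minimal sketch of training an SVM on one fixed-length descriptor vector per gait cycle. The use of scikit-learn, the descriptor dimensionality, the kernel and the hyperparameters are assumptions for illustration, not details taken from the paper; the data below is synthetic.

```python
# Hedged sketch: SVM identification from per-cycle motion descriptors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 230 gait cycles, 22 subjects, 96-dim descriptors (assumed size).
X = rng.normal(size=(230, 96))      # spatio-temporal motion descriptors (synthetic)
y = rng.integers(0, 22, size=230)   # subject identity labels (synthetic)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM with feature standardization; hyperparameters are illustrative.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print("identification accuracy:", clf.score(X_test, y_test))
```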

Cited by 23 publications (27 citation statements)
References 16 publications
“…Figure 6 shows that our method achieves the best results.

Table 4: EER, AUC and MAP derived from FAR/FRR, ROC and recall/precision, respectively, plotted in Figure 6.

Method            EER     AUC     MAP
Ahmed [9]         0.2779  0.7811  0.7948
Andersson [10]    0.2858  0.7755  0.7996
Balazia           0.1674  0.8902  0.8907
Ball [11]         0.3949  0.6714  0.6931
Dikovski [13]     0.2273  0.8288  0.8287
Gavrilova [14]    0.1922  0.8521  0.8904
Jiang [15]        0.2393  0.8414  0.8497
Krzeszowski [16]  0.2287  0.8464  0.8593
Kumar [17]        0.3545  0.6528  0.6197
Kwolek [18]       0.1839  0.8796  0.8840
Preis [19]        0.4916  0.5225  0.5356
Sedmidubsky [20]  0.2823  0.7726  0.7558
Sinha [21]        0.2194  0.8289  0.8493
…”
Section: Results of Comparative Evaluation
confidence: 99%
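As a reading aid for the table above, the following is a minimal sketch of how EER, AUC and MAP can be derived from genuine/impostor similarity scores, mirroring the caption (FAR/FRR → EER, ROC → AUC, recall/precision → MAP). The scores are synthetic and the pooled average-precision computation is a simplification; nothing here reproduces the cited evaluation protocol.

```python
# Hedged sketch: biometric verification metrics from synthetic score sets.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score, average_precision_score

rng = np.random.default_rng(1)
genuine = rng.normal(loc=0.7, scale=0.15, size=500)    # same-subject comparisons
impostor = rng.normal(loc=0.4, scale=0.15, size=5000)  # cross-subject comparisons

scores = np.concatenate([genuine, impostor])
labels = np.concatenate([np.ones_like(genuine), np.zeros_like(impostor)])

fpr, tpr, _ = roc_curve(labels, scores)   # FAR = fpr, FRR = 1 - tpr
frr = 1.0 - tpr

# EER: operating point where FAR and FRR cross.
eer_idx = np.nanargmin(np.abs(fpr - frr))
eer = (fpr[eer_idx] + frr[eer_idx]) / 2.0

auc = roc_auc_score(labels, scores)                   # area under the ROC curve
map_ = average_precision_score(labels, scores)        # area under recall/precision

print(f"EER={eer:.4f}  AUC={auc:.4f}  MAP={map_:.4f}")
```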
“…The databases of all surveyed research groups are either acquired by Kinect tracking a small area [9,13,16,17,18,19] or contain pre-calibrated skeletons of all participants [20]. Another problem is their size: The largest structural gait database currently available has 230 gait cycles of 22 walking subjects.…”
Section: Experimental Databases
confidence: 99%
“…They are typically combinations of static body parameters (bone lengths, person's height) [18] with dynamic gait features such as step length, walking speed, joint angles and inter-joint distances [3,6,18,20], along with various statistics (mean, standard deviation or maximum) of their signals [5]. Clearly, these features are schematic and human-interpretable, which is convenient for visualizations and for intuitive understanding, but unnecessary for automatic gait recognition.…”
Section: Implementation Details of Algorithms
confidence: 99%
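To make the kind of hand-crafted features in the quoted passage concrete, below is a hedged sketch that combines static body parameters with statistics (mean, standard deviation, maximum) of dynamic per-frame signals into one feature vector. The joint indices, the chosen signals and the array layout are assumptions for illustration, not the exact features of any cited method.

```python
# Hedged sketch: static + dynamic gait features from 3D joint trajectories.
import numpy as np

def gait_features(joints):
    """joints: array of shape (T, J, 3) -- T frames, J 3D joint positions."""
    head, l_ankle, r_ankle = 0, 1, 2          # assumed joint indices
    feet = np.linalg.norm(joints[:, l_ankle] - joints[:, r_ankle], axis=1)

    # Static parameter: person's height approximated from head vs. lowest ankle.
    height = np.max(joints[:, head, 2] - np.minimum(joints[:, l_ankle, 2],
                                                    joints[:, r_ankle, 2]))

    # Dynamic signals: inter-ankle distance (step-related) and head velocity.
    step_length = np.max(feet)
    speed = np.mean(np.linalg.norm(np.diff(joints[:, head], axis=0), axis=1))

    # Statistics of a per-frame signal, as commonly used in surveyed methods.
    stats = [feet.mean(), feet.std(), feet.max()]
    return np.array([height, step_length, speed, *stats])

# Example: a synthetic 30-frame sequence with 3 joints.
demo = np.random.default_rng(2).normal(size=(30, 3, 3))
print(gait_features(demo))
```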
“…Gait samples are compared by a distance function of their covariance matrices. • Kwolek by Kwolek et al. [18] processes signals of bone angles around all axes, the person's height and step length. The gait cycles are normalized to 30 frames.…”
Section: Implementation Details of Algorithms
confidence: 99%
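The 30-frame normalization mentioned for Kwolek et al. [18] can be done, for example, by resampling each per-frame signal onto a fixed-length time axis. The sketch below assumes linear interpolation and a (frames × signals) array layout; these are illustrative choices rather than details from the paper.

```python
# Hedged sketch: time-normalizing a variable-length gait cycle to 30 frames.
import numpy as np

def normalize_cycle(signals, n_frames=30):
    """signals: array (T, D) of per-frame gait signals (e.g. bone angles)."""
    T, D = signals.shape
    src = np.linspace(0.0, 1.0, T)
    dst = np.linspace(0.0, 1.0, n_frames)
    # Interpolate each signal dimension independently onto the new time axis.
    return np.stack([np.interp(dst, src, signals[:, d]) for d in range(D)], axis=1)

# Example: a 47-frame cycle of 20 angle signals resampled to 30 frames.
cycle = np.random.default_rng(3).normal(size=(47, 20))
print(normalize_cycle(cycle).shape)  # (30, 20)
```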