2016
DOI: 10.1016/j.gaitpost.2016.04.004

Accuracy of KinectOne to quantify kinematics of the upper body

Abstract: Motion analysis systems deliver quantitative information, e.g. on the progress of rehabilitation programs aimed at improving range of motion. Markerless systems are of interest for clinical application because they are low-cost and easy to use. The first generation of the Kinect™ sensor showed promising results in validity assessment compared to an established marker-based system. However, no literature is available on the validity of the new 'Kinect™ for Xbox one' (KinectOne) in tracking upper body motion. Co…


Cited by 35 publications (42 citation statements)
References 18 publications
“…This is mainly due to a combination of the type of exercises performed (large motions with relatively low dynamics) and their short duration, which allows even ACC or GYR estimates to have a limited RMSE. The IMU estimates employing sensor fusion algorithms outperform the Kinect's output, though by a limited margin, revealing that both approaches (sensor-based or camera-based) perform well overall, with errors in the range of 3 to 8 degrees for all the joint angles analyzed, which is consistent with the existing literature [39,55,[59][60][61]68]. As expected, the exercises performed while wearing clothes show a slightly higher RMSE and deviations; however, they are consistent with the standard GA case.…”
Section: Results (supporting)
confidence: 88%
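The statement above compares joint-angle accuracy in terms of RMSE between a Kinect-style markerless estimate and a reference measurement. A minimal sketch of that metric, with hypothetical shoulder-flexion traces invented purely for illustration (they are not data from the cited studies):

```python
import math

def rmse(estimates, reference):
    """Root-mean-square error between two equal-length joint-angle series (degrees)."""
    assert len(estimates) == len(reference)
    return math.sqrt(
        sum((e - r) ** 2 for e, r in zip(estimates, reference)) / len(estimates)
    )

# Hypothetical traces sampled at the same instants: a marker-based reference
# and a markerless (Kinect-style) estimate of a shoulder-flexion movement.
reference = [0.0, 15.0, 45.0, 90.0, 45.0, 15.0, 0.0]
estimate = [2.0, 18.0, 41.0, 95.0, 48.0, 12.0, 1.0]

print(round(rmse(estimate, reference), 2))  # → 3.23
```

For this synthetic example the RMSE falls in the 3 to 8 degree band the citing paper reports for its joint angles.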
“…Focusing on human motion capture applications, the use of the Kinect v1 in such scenarios was triggered by the release of reverse-engineered open-source drivers and tracking software [34] and then propelled by the release of Microsoft's SDK [35]. The second-generation device and its updated algorithm have been validated further within the context of clinical motion analysis, with applications such as posture and balance evaluation [36,37], fall detection [38], rehabilitation exercises [39][40][41], and gait assessment [42][43][44]. Moreover, the usability of Kinect-based home rehabilitation systems has been investigated, providing insights into user acceptance, with good results and indications for future improvements [45,46].…”
Section: Low-Cost Video Sensing: The Kinect (mentioning)
confidence: 99%
“…Previous publications showed that Kinect V1 and V2 measurements of landmark angles [16, 20, 21] and lengths of body parts [22] derived from different movements may lack accuracy. Therefore, we focused on parameters based on single ‘stable’ landmarks, with the exception of POCO, where foot landmarks were integrated into an anchor point for the sway vector.…”
Section: Discussion (mentioning)
confidence: 99%
“…Hence, many scientific studies have been conducted to evaluate the reliability and validity of the calculated skeleton joints [11,19–24]. The majority of these studies employed marker-based motion capture systems as a gold standard.…”
Section: Introduction (mentioning)
confidence: 99%