2022
DOI: 10.3390/app12041796

Evaluation of Vision-Based Hand Tool Tracking Methods for Quality Assessment and Training in Human-Centered Industry 4.0

Abstract: Smart industrial workstations for the training and evaluation of workers are an innovative approach to addressing the problems of manufacturing quality assessment and fast training. However, such products do not implement algorithms that are able to accurately track the pose of a hand tool that might also be partially occluded by the operator’s hands. In the best case, the already proposed systems roughly track the position of the operator’s hand center, assuming that a certain task has been performed if the hand cen…

Citations: cited by 11 publications (9 citation statements)
References: 36 publications
“…Related approaches that identify the orientation of human body poses may be used in cases of Human–Robot Collaboration for real-time decision making and path planning to carry out tasks. Similarly, De Feudis et al. [43] assessed four different vision systems for hand tool pose estimation: ArUco, OpenPose, Azure Kinect Body Tracking, and a YOLO network were evaluated, with the HTC Vive as the benchmarking system. Further, in a study presented in [44], the Azure Kinect and the Intel RealSense D435i were compared; the Intel RealSense was reported to show poorer estimation performance beyond 2 m, while the Azure Kinect performed better.…”
Section: State of the Art Review (mentioning)
confidence: 99%
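For readers unfamiliar with the marker-based branch of that comparison, the sketch below shows how ArUco marker pose estimation is typically done with OpenCV's contrib aruco module (pre-4.7 API). The camera matrix, distortion coefficients, marker dictionary, and marker size are illustrative assumptions, not values taken from De Feudis et al.

```python
# Minimal ArUco pose-estimation sketch (OpenCV contrib "aruco" module, pre-4.7 API).
# Camera intrinsics, distortion, marker size and dictionary are placeholder assumptions.
import cv2
import numpy as np

K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])          # assumed camera matrix
dist = np.zeros(5)                        # assumed zero lens distortion
MARKER_LEN = 0.05                         # assumed marker side length in metres

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def estimate_marker_pose(frame_bgr):
    """Return (rvec, tvec) of the first detected marker, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None or len(corners) == 0:
        return None
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LEN, K, dist)
    return rvecs[0], tvecs[0]             # rotation (Rodrigues vector) and translation
```

A marker rigidly attached to the tool yields the tool pose up to a fixed marker-to-tool transform; as the abstract notes, partial occlusion by the operator’s hands is the obvious weak point of such approaches.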
“…The experimentation involved three different motion scenarios of a human operator handling a cordless drill, with its mandrel considered as the point of interest to be tracked [43]. The root-mean-square point-to-point distance (D.RMS) and the multivariate R² were used as the accuracy evaluation criteria.…”
Section: State of the Art Review (mentioning)
confidence: 99%
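As a concrete reading of those two criteria, the snippet below computes an RMS point-to-point distance and one common form of multivariate R² between an estimated trajectory and a reference trajectory (the role played by the HTC Vive in the benchmark). This is a sketch under assumed definitions; the exact formulations used in [43] may differ, and the data here are synthetic.

```python
import numpy as np

def rms_point_distance(est, ref):
    """RMS of per-sample Euclidean distances between two (N, 3) trajectories."""
    d = np.linalg.norm(est - ref, axis=1)
    return float(np.sqrt(np.mean(d ** 2)))

def multivariate_r2(est, ref):
    """One common multivariate R^2: 1 - SS_res / SS_tot, pooled over all axes."""
    ss_res = np.sum((ref - est) ** 2)
    ss_tot = np.sum((ref - ref.mean(axis=0)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Illustrative usage with synthetic data (not the paper's measurements):
ref = np.cumsum(np.random.randn(500, 3) * 0.01, axis=0)   # reference path (ground truth role)
est = ref + np.random.randn(500, 3) * 0.005               # noisy estimated path
print(rms_point_distance(est, ref), multivariate_r2(est, ref))
```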
“…In terms of quality control analysis, AR has emerged as a valuable tool, allowing for more accurate and detailed assessment. A study by De Feudis et al. [78] explored the application of AR to quality assessment, making use of hand tool tracking through machine vision techniques. AR, with its capacity to provide real-time visualisations, is distinguished by its adaptability and flexibility, enabling the constant optimisation of quality protocols and the agile adaptation of quality standards to the changing dynamics of the industrial sector.…”
Section: AR for Manufacturing Process Analysis (mentioning)
confidence: 99%
“…The Body Tracking SDK is built on a complex deep learning model that allows body segmentation, human skeleton reconstruction, human body instance detection, and real-time body tracking. According to [21], a power drill chuck pose can be best estimated by looking at the n.8 (left hand) and n.15 (right hand) Kinect hand joints. However, after multiple tests, these joints were ruled out because of their poor orientation predictions.…”
Section: Hand Detection and Recognition (mentioning)
confidence: 99%
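To make those joint indices concrete: in the Azure Kinect Body Tracking joint enumeration, index 8 is HAND_LEFT and index 15 is HAND_RIGHT. The sketch below only shows how these two joints could be read from a per-frame skeleton array and combined with a fixed hand-to-chuck offset; the array layout and the 150 mm offset are illustrative assumptions, not the SDK's capture API or a calibrated value, and the quoted passage explains why this scheme struggles in practice: the joint orientation estimates are unreliable.

```python
import numpy as np

# Azure Kinect Body Tracking joint indices referenced in the quoted passage.
HAND_LEFT = 8
HAND_RIGHT = 15

def hand_positions(skeleton_xyz):
    """Extract left/right hand joint positions from a (32, 3) skeleton array
    of joint coordinates in millimetres (one row per k4abt joint)."""
    return skeleton_xyz[HAND_LEFT], skeleton_xyz[HAND_RIGHT]

def rough_chuck_estimate(hand_xyz, hand_orientation_R,
                         offset_mm=np.array([0.0, 0.0, 150.0])):
    """Hypothetical drill-chuck estimate: hand joint position plus a fixed
    offset rotated by the hand joint's 3x3 orientation matrix. The 150 mm
    offset is an illustrative placeholder; as the quoted text notes, the
    joint orientations proved too unreliable for this to work well."""
    return hand_xyz + hand_orientation_R @ offset_mm
```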