2021
DOI: 10.3389/fnbot.2021.703545
Egocentric Gesture Recognition Using 3D Convolutional Neural Networks for the Spatiotemporal Adaptation of Collaborative Robots

Abstract: Collaborative robots are currently deployed in professional environments, in collaboration with professional human operators, helping to strike the right balance between mechanization and manual intervention in manufacturing processes required by Industry 4.0. In this paper, the contribution of gesture recognition and pose estimation to the smooth introduction of cobots into an industrial assembly line is described, with a view to performing actions in parallel with the human operators and enabling interaction…

Cited by 8 publications (7 citation statements)
References 54 publications
“…The objects involved in this TV assembly process are a TV frame and two circuit cards: the power supply (PSU, gold card) and the mainboard (chassis, green card). An earlier study [24] demonstrated the network's ability to recognize the 11 gestures with 98.5% accuracy.…”
Section: A. Gesture Recognition With 3D CNNs (mentioning)
confidence: 98%
“…To summarize, highly adaptable and rapidly reconfigurable frameworks with real-time data processing capabilities are necessary to handle the industry's ergonomics challenges while simultaneously ensuring productivity. This paper expands on the authors' prior work on adaptive HRC [24], and automatic ergonomic monitoring [25], [26], [27], with the goal of creating a unified ergonomic and reconfigurable HRC framework.…”
Section: State of the Art (mentioning)
confidence: 99%
“…In addition to human-computer interaction, the concept has also been extended to human-robot interaction. Papanagiotou et al. (2021) proposed a multi-task approach including gesture recognition to enable human-robot collaboration on an industrial assembly line. The main component is a gesture recognition module based on a 3D CNN trained on egocentric data acquired with a GoPro camera.…”
Section: Hand Gestures (mentioning)
confidence: 99%
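The statement above describes the paper's gesture recognizer as a 3D CNN operating on egocentric video. As an illustrative sketch only (not the authors' architecture), the core operation such a network stacks is a convolution whose filter slides over time as well as space, so each output responds to a short spatiotemporal pattern rather than a single frame. A minimal NumPy version of one such filter; all names and sizes here are hypothetical:

```python
import numpy as np

def conv3d_single(clip, kernel):
    """Valid 3D cross-correlation of a video clip with one
    spatiotemporal kernel (hypothetical helper, for illustration).
    clip:   (T, H, W) array - frames x height x width
    kernel: (kt, kh, kw) array
    """
    T, H, W = clip.shape
    kt, kh, kw = kernel.shape
    out = np.zeros((T - kt + 1, H - kh + 1, W - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                # each output value summarizes a kt-frame, kh x kw patch
                out[t, i, j] = np.sum(clip[t:t + kt, i:i + kh, j:j + kw] * kernel)
    return out

# A 16-frame 32x32 clip filtered with a 3x3x3 averaging kernel
clip = np.random.rand(16, 32, 32)
kernel = np.ones((3, 3, 3)) / 27.0
features = conv3d_single(clip, kernel)
print(features.shape)  # → (14, 30, 30)
```

In a real 3D CNN many such kernels are learned per layer and followed by nonlinearities and pooling; the point of the sketch is only that the temporal axis is convolved exactly like the spatial ones, which is what lets the network model gesture motion directly.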