2019
DOI: 10.1016/j.ijhcs.2019.03.011
Systematic literature review of hand gestures used in human computer interaction interfaces

Cited by 139 publications (68 citation statements)
References 140 publications (198 reference statements)
“…These can be seen in the forms of video processing libraries and algorithms, an array of machine learning frameworks and architectures and to an extent, relevant corpora. Technology facilitating gesture-based interfaces, relevant also to the automated translation of sign languages, can be split into two overall groupings: those relying primarily on wearables such as gloves, rings, accelerometers, etc., and those relying primarily on a camera or sensor-based tracking, where users' hand gestures are recorded at a distance [3]. In the first group, there are solutions such as the one presented by Cheng and colleagues [4], where an accelerometer is used to obtain 3D information of the hand movement with the aim of controlling an entertainment robot.…”
Section: State-of-the-art in Gesture Recognition and Automated Sign Language
confidence: 99%
“…Hand gesture and/or pose tracking was required to perform at least one of the following: classify static gesture categories, estimate dynamic finger flexing angles, or classify hand movement trajectories. Articles based on computer vision methods or based on data gloves for hand gesture recognition were excluded as these have been the topic of recent reviews [8], [19], [20]. In Section II, we provide an overview of applications of wearable hand gesture recognition based on hand function.…”
Section: Introduction
confidence: 99%
“…Users are almost always connected to computer interfaces in their everyday lives [1]. Accordingly, such user interactions for computer-based applications have evolved in diverse ways, for example, from visual interaction [2] and gesture interaction [3], to voice interaction [4] and motion capture [5]. These interaction ways or methods are collectively referred to as interaction modalities [6][7][8].…”
Section: Introduction
confidence: 99%