Proceedings of the 4th International Workshop on Sensor-Based Activity Recognition and Interaction 2017
DOI: 10.1145/3134230.3134236
Real-time Embedded Recognition of Sign Language Alphabet Fingerspelling in an IMU-Based Glove

Cited by 22 publications (7 citation statements). References 16 publications.
“…In the mobile and ubiquitous computing community, there have been efforts to exploit sensing platforms for sign language translation [14,34,37,39,44,57,60]. These previous works use devices such as RGB cameras [5,21,27,29,33,35,46,54,55], motion sensors (e.g., Leap Motion) [14,41], depth cameras/sensors (e.g., Kinect) [6,10,11,16,38,48,51], or electromyogram (EMG) sensors [53,57] to capture user hand motions and combine sensing results with various machine learning models to infer the word being expressed.…”
Section: :2 • Park et al. (mentioning)
confidence: 99%
“…Mohandes [37] designed a two-handed sign recognition system for Unified Arabic Sign Language using CyberGloves, with a support vector machine (SVM) as the classifier. Mummadi et al. [39] designed a real-time recognition system for the French Sign Language alphabet using a custom-built IMU sensor glove. In that work, the authors use a multi-layer perceptron (MLP) to classify the different alphabet letters.…”
Section: Related Work (mentioning)
confidence: 99%
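The MLP classification step described in the statement above can be sketched roughly as follows. This is a minimal illustration only: the input dimensionality (5 fingers × 4 quaternion components), hidden-layer size, and randomly initialised weights are all hypothetical stand-ins, not details taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 5 fingers x 4 quaternion components = 20 inputs,
# one hidden layer, 26 output classes (one per fingerspelled letter).
n_in, n_hidden, n_out = 20, 64, 26

# Random weights stand in for parameters a real system would learn by training.
W1 = rng.standard_normal((n_in, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_out)) * 0.1
b2 = np.zeros(n_out)

def mlp_predict(x):
    """Single forward pass: ReLU hidden layer, then softmax over letters."""
    h = np.maximum(0.0, x @ W1 + b1)        # hidden activations
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())       # numerically stable softmax
    probs = e / e.sum()
    return probs.argmax(), probs

x = rng.standard_normal(n_in)               # one frame of glove features
letter_idx, probs = mlp_predict(x)
print(letter_idx, probs.sum())
```

A real-time embedded variant would run this forward pass per sensor frame; the small parameter count is what makes an MLP attractive on a microcontroller.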
“…Hand gesture recognition was previously explored in [1,4,5,6]. In [18], a similar glove prototype was built to recognize French Sign Language characters by estimating the rotation of each finger. A single-sensor glove prototype called GestGlove [19] was built to recognize a simple set of hand gestures for phone control.…”
Section: Related Work (mentioning)
confidence: 99%
“…The algorithms for HAR can be classified into shallow and deep learning methods. Common shallow methods in HAR include SVM [13], [20], [23], k-nearest neighbors (kNN) [16], [24], linear discriminant analysis (LDA) [9], and random forest (RF) [21]. Deep learning approaches, such as LSTM [7], [15], CNN-LSTM [25], [27], CNN [22], and convLSTM [26], have shown impressive leaps in performance over their shallow counterparts by learning to extract features automatically from raw sensor data, removing the need for hand-engineered features from human experts.…”
Section: Background, A. Human Activity Recognition (HAR) (mentioning)
confidence: 99%
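The shallow-vs-deep distinction in the statement above usually comes down to what is fed to the model: shallow methods consume hand-engineered statistics computed over sliding windows, while deep models consume the raw windows themselves. A minimal sketch, assuming a synthetic 3-axis accelerometer stream and common window statistics (mean, standard deviation, energy); window size and stride are illustrative choices, not values from the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 3-axis accelerometer stream: 500 samples.
stream = rng.standard_normal((500, 3))

def sliding_windows(data, size=100, step=50):
    """Fixed-size overlapping windows, as commonly used in HAR pipelines."""
    return np.stack([data[i:i + size] for i in range(0, len(data) - size + 1, step)])

def hand_engineered_features(windows):
    """Per-window mean, std, and average signal energy per axis."""
    mean = windows.mean(axis=1)
    std = windows.std(axis=1)
    energy = (windows ** 2).sum(axis=1) / windows.shape[1]
    return np.concatenate([mean, std, energy], axis=1)

windows = sliding_windows(stream)             # deep models (CNN/LSTM) take these raw
features = hand_engineered_features(windows)  # shallow models (SVM/kNN/RF) take these
print(windows.shape, features.shape)
```

The deep approaches cited above effectively learn a replacement for `hand_engineered_features` directly from `windows`, which is the source of their performance advantage.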