2017
DOI: 10.1016/j.neucom.2016.08.132

A multimodal framework for sensor based sign language recognition

Cited by 169 publications (81 citation statements)
References 26 publications
“…It can generate various hand and finger motion tracking information, such as the palm position, velocity, normal, direction, and grab strength, as well as the fingertip position, velocity, direction, length, width, and finger extended state. Due to its small observation area and high resolution, Leap Motion Controller has been widely used as a human-machine interface in the context of robot control [17], sign-language recognition [32], and text segmentation [33].…”
Section: Leap Motion Controller (mentioning)
confidence: 99%
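The per-frame attributes quoted above (palm position, velocity, normal, direction, grab strength; fingertip position, velocity, direction, length, width, extended state) map naturally onto a flat feature vector for recognition. A minimal sketch of such a record, with hypothetical field names rather than the Leap Motion SDK's actual API:

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class FingerState:
    # Per-finger attributes named in the citation statement above
    tip_position: Vec3
    tip_velocity: Vec3
    direction: Vec3
    length: float
    width: float
    extended: bool

@dataclass
class HandFrame:
    # Per-palm attributes named in the citation statement above
    palm_position: Vec3
    palm_velocity: Vec3
    palm_normal: Vec3
    palm_direction: Vec3
    grab_strength: float  # 0.0 (open hand) .. 1.0 (fist)
    fingers: List[FingerState]

    def feature_vector(self) -> List[float]:
        """Flatten one frame into a numeric vector, as a
        recognition pipeline might consume it (13 palm values
        plus 12 values per tracked finger)."""
        v = [*self.palm_position, *self.palm_velocity,
             *self.palm_normal, *self.palm_direction,
             self.grab_strength]
        for f in self.fingers:
            v += [*f.tip_position, *f.tip_velocity, *f.direction,
                  f.length, f.width, float(f.extended)]
        return v
```

With one tracked finger the vector has 13 + 12 = 25 entries; a full hand yields 13 + 5 × 12 = 73.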
“…Several studies demonstrate the usability and robustness of the Leap Motion Controller. Some discuss its tracking accuracy [21], [22]; others its advantages over other tracking systems [23], software for sign language recognition [24], and applications in games [25], medicine [26], robotics [27], and education [20], [28], among others.…”
Section: Leap Motion Controller (unclassified)
“…The use of constrained grammars and coloured gloves produced low error rates on both training and test data [13]. Using sensor devices, a multimodal framework is applied for isolated sign language translation [14]. The sensors are used to capture finger and palm positions, and then a Hidden Markov Model (HMM) and a Bidirectional Long Short-Term Memory Neural Network (BLSTM-NN) are used for classification.…”
Section: Introduction (mentioning)
confidence: 99%
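HMM-based classification of isolated signs, as referenced above, works by training one model per sign and picking the sign whose model assigns the observation sequence the highest likelihood. A toy sketch with a discrete-observation HMM and the standard log-space forward algorithm (the sign names, observation symbols, and parameters below are illustrative only, not the paper's actual models or features):

```python
import math

def _logsumexp(xs):
    """Numerically stable log(sum(exp(x))) for a list of log-values."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space."""
    n_states = len(start_p)
    # Initialise with the first observation
    alpha = [math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
             for s in range(n_states)]
    # Recurse over the remaining observations
    for o in obs[1:]:
        alpha = [
            _logsumexp([alpha[sp] + math.log(trans_p[sp][s])
                        for sp in range(n_states)])
            + math.log(emit_p[s][o])
            for s in range(n_states)
        ]
    return _logsumexp(alpha)

def classify(obs, models):
    """Return the sign whose HMM gives the sequence the highest likelihood."""
    return max(models,
               key=lambda sign: forward_log_likelihood(obs, *models[sign]))

# Illustrative two-state models over two observation symbols {0, 1}:
# "hello" tends to emit symbol 0, "thanks" tends to emit symbol 1.
models = {
    "hello":  ([0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]],
               [[0.8, 0.2], [0.8, 0.2]]),
    "thanks": ([0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]],
               [[0.2, 0.8], [0.2, 0.8]]),
}
```

Usage: `classify([0, 0, 0, 0], models)` returns `"hello"`, since that model assigns the all-zeros sequence the higher likelihood. A real pipeline would use continuous emissions over sensor feature vectors and train the parameters with Baum-Welch rather than hand-setting them.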