2018
DOI: 10.1007/s11265-018-1375-6

Hand Sign Recognition for Thai Finger Spelling: an Application of Convolution Neural Network

Cited by 42 publications (46 citation statements). References 32 publications.
“…ASL is the foundation of Thai finger-spelling sign language (TFSL). TFSL was invented in 1953 by Khunying Kamala Krairuek, using American finger-spelling as a prototype, to represent the 42 Thai consonants, 32 vowels, and 6 intonation marks [2]. All forty-two Thai letters can be represented with combinations of twenty-five hand gestures.…”
Section: Introduction (mentioning); confidence: 99%
“…Deep learning is a tool that is increasingly being used in sign language recognition [2,9,10], face recognition [11], object recognition [12], and others. This technology is used for solving complex problems such as object detection [13], image segmentation [14], and image recognition [12].…”
Section: Introduction (mentioning); confidence: 99%
“…A variety of methods for the analysis of sensor data [1]-[4] and the extraction of meaningful patterns from these data have been proposed in recent decades [5]. Data collected by various sensors such as image, voice, electromyography (EMG) and chemical sensors are used for different applications such as image recognition [6]-[8], speech recognition [9], [10], gesture recognition [11]-[14] and gas classification [15]-[20]. The performance of classification techniques using sensor data varies greatly depending not only on the amount of data collected but also on the quality of the data.…”
Section: Introduction (mentioning); confidence: 99%
“…This task presents a challenging problem that has not yet been solved in computer vision and machine learning. Unlike many previous studies [1,2,3,4,5,6,7], which have separately tried to address hand detection or hand gesture recognition, our approach attempts to jointly solve the problem of hand localization and gesture recognition. This task, however, is very challenging due to the significant variations of hand images in realistic scenarios.…”
Section: Introduction (mentioning); confidence: 99%
“…This problem, however, represents a high level of complexity, and retrieving the hand shape is difficult due to the vast number of hand configurations and variations of the viewpoint with respect to the image sensor. Furthermore, recognizing static hand gestures plays an important role in many applications, such as sign language recognition for deaf and speech-impaired people [6,7], driver hand monitoring and hand gesture commands to reduce driver distraction [4,18], alternative input methods for interfacing between humans and machines [2,8], in-air writing interaction [19], hand-object interaction in augmented and virtual reality environments [20], and many other applications.…”
Section: Introduction (mentioning); confidence: 99%