2016
DOI: 10.5120/ijca2016909118
Sign Energy Images for Recognition of Sign Language at Sentence Level

Abstract: In this paper, the task of sign language recognition at sentence level is addressed. The idea of Sign Energy Image (SEI) and a method of extracting Fuzzy-Gaussian Local Binary Pattern (FzGLBP) features from SEI to characterize the sign are explored. The suitability of interval valued type symbolic data for efficient representation of signs in the knowledgebase is studied. A Chi-square proximity measure is used to establish matching between reference and test signs. A simple nearest neighbor classification tech…
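The abstract outlines a matching pipeline: FzGLBP features extracted from the Sign Energy Image are stored in the knowledgebase, compared with a Chi-square proximity measure, and classified with a nearest neighbor rule. Below is a minimal sketch of that matching step only, assuming the features reduce to non-negative histogram-like vectors; it ignores the paper's interval-valued symbolic representation, and the names `chi_square_distance`, `nearest_neighbor_sign` and `reference_db` are hypothetical, not taken from the paper.

```python
import numpy as np

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square proximity between two non-negative feature histograms."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def nearest_neighbor_sign(test_features, reference_db):
    """Label of the reference sign whose features are closest (chi-square) to the test sign."""
    return min(reference_db,
               key=lambda label: chi_square_distance(test_features, reference_db[label]))

# Toy 4-bin histograms standing in for FzGLBP feature vectors (illustrative only).
reference_db = {
    "HELLO HOW ARE YOU": [0.40, 0.30, 0.20, 0.10],
    "GOOD MORNING":      [0.10, 0.20, 0.30, 0.40],
}
print(nearest_neighbor_sign([0.35, 0.30, 0.25, 0.10], reference_db))  # -> "HELLO HOW ARE YOU"
```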

Cited by 3 publications (3 citation statements)
References 43 publications
“…Another approach was to perform SLR at the finger-spelling level [13][14][15][16], where recognition revolved solely around various hand shapes describing alphabets and numbers. SLR at the word level [17][18][19][20] and at the sentence level [20][21][22][23] further increased the applicability of recognition to real-world SLR problems, which involve sequences of continuous hand gestures constituting sign language messages. On the other hand, some interesting attempts [24][25][26][27][28] considered facial expression as part of the recognition task.…”
Section: Related Work
confidence: 99%
“…Since the signs used by hearing-impaired people are very abstract, sign language recognition based on fingerspelling or words seems cumbersome and not effective. With this observation, only two attempts have recently been reported that address the problem at the sentence level [4,15,16]. Therefore, there is scope for many more attempts in this direction.…”
Section: Related Work
confidence: 99%
“…The research works reported for sign language recognition have addressed the task at the finger-spelling level [2,12,13,21,24,25], at the word level [11,17,24] and at the sentence level [4,15,16]. Some of the techniques proposed by the research community, which have gained importance due to their performance, are Ichetrichef moments [6], Gray level histogram [29], Sensor based glove technique [6,7,10,17], Hidden Markov Models (HMM) [1], Hu moments and Electromyography (EMG) segmentation [1], Localized contour sequence [10], Size function [17], Transition-movement [5], Moment based size function [8], Convex chain coding and Basic chain code [28], Fourier descriptors [22], Grassman Covariance Matrix (GCM) [31], Fusion of appearance based and 5DT glove based features [19], and Sparse Observation (SO) description [27].…”
Section: Related Work
confidence: 99%