2014
DOI: 10.21307/ijssis-2017-647

Study Of Vision Based Hand Gesture Recognition Using Indian Sign Language

Abstract: Human Computer Interaction is moving forward in the field of sign language interpretation. An Indian Sign Language (ISL) interpretation system is a good way to help Indian hearing-impaired people interact with hearing people with the help of a computer. Compared to other sign languages, ISL interpretation has received less attention from researchers. In this paper, some historical background, need, scope and concerns of ISL are given. Vision-based hand gesture recognition systems are discussed, as the hand plays a vi…

Cited by 44 publications (9 citation statements) · References 41 publications

Citation statements (ordered by relevance):
“…Kanika Rastogi and Pankaj Bhardwaj [8] explained the use of a smart glove worn on the fingers of deaf and mute users, with which sign language is converted into text.…”
Section: Review Of Literature (mentioning)
confidence: 99%
“…According to the information collection modes of the input devices [14], gesture recognition can be roughly divided into two categories: vision-based recognition and sensor-based recognition. In general, vision-based recognition has been studied extensively for human interaction; it usually adopts one or more video cameras to capture and recognize arm motion traces (see [10, 15-19] for more details). Sensor-based recognition [20] uses different sensors (e.g., accelerometer [21], gyroscope [22], and body-worn sensors [23, 24]) to perceive position and orientation data and translate the data into coordinates and angles.…”
Section: Related Work (mentioning)
confidence: 99%
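The "translate the data into coordinates and angles" step in the sensor-based approach can be illustrated with a minimal sketch: assuming a wrist- or glove-mounted accelerometer reporting readings in g, the roll and pitch of the hand can be estimated from the gravity vector. The function and variable names below are illustrative assumptions, not taken from the cited works.

import math

def tilt_angles_from_accel(ax, ay, az):
    """Convert raw accelerometer readings (in g) into roll and pitch angles.

    Illustrative sketch of turning sensor data into orientation angles;
    axis conventions are assumed (x forward, y left, z up)."""
    roll = math.atan2(ay, az)                           # rotation about the x-axis
    pitch = math.atan2(-ax, math.sqrt(ay**2 + az**2))   # rotation about the y-axis
    return math.degrees(roll), math.degrees(pitch)

# Example: a hand held roughly flat with a slight forward tilt
print(tilt_angles_from_accel(0.17, 0.0, 0.98))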
“…In sign language recognition [4], hand segmentation is a key task in the gesture recognition process, which is why some authors use the HSV [5] or YCbCr [1] color models. Currently, machine-learning techniques such as Hidden Markov Models [21] or Neural Networks [9] are used in the recognition process.…”
Section: Introduction (mentioning)
confidence: 99%
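As a rough illustration of the color-model-based hand segmentation mentioned above, here is a minimal OpenCV sketch of skin thresholding in HSV. The threshold values and helper name are assumptions that would need tuning per camera and lighting; they are not taken from the cited papers.

import cv2
import numpy as np

def segment_hand_hsv(frame_bgr):
    """Rough skin-color segmentation in HSV for hand gesture pipelines.

    The bounds below are illustrative assumptions, not values from the paper."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small noise and fill holes before contour extraction / classification.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    hand_only = cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)
    return hand_only, mask

A YCbCr variant would follow the same pattern, thresholding the Cb and Cr channels after cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb).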