2016 International Conference on Computer Communication and Informatics (ICCCI)
DOI: 10.1109/iccci.2016.7479951
K-nearest correlated neighbor classification for Indian sign language gesture recognition using feature fusion

Cited by 65 publications (16 citation statements) | References 5 publications
“…The preprocessing operations include resizing, converting, filtering, enhancing, and segmenting images. For instance, techniques such as skin-color detection and filtering, RGB-to-HSV conversion, grayscale conversion, and hand segmentation are commonly introduced [9,10]. Moreover, preprocessing is also performed during the testing phase to obtain the region of interest (ROI).…”
Section: Preprocessing
confidence: 99%
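The preprocessing pipeline summarised in this excerpt can be sketched in a few lines. The snippet below is a minimal illustration, not the cited papers' implementation: the HSV skin-colour thresholds, target image size, and function name are illustrative assumptions. It shows resizing, RGB-to-HSV conversion, skin-colour thresholding and filtering, hand segmentation, and grayscale conversion to obtain the ROI.

# Minimal preprocessing sketch (assumed parameter values, not from the cited papers).
import cv2
import numpy as np

def preprocess_hand_image(bgr_image, size=(128, 128)):
    """Return the grayscale hand ROI, or None if no skin-like region is found."""
    resized = cv2.resize(bgr_image, size)

    # Skin-colour detection in HSV space (threshold values are assumptions).
    hsv = cv2.cvtColor(resized, cv2.COLOR_BGR2HSV)
    lower_skin = np.array([0, 30, 60], dtype=np.uint8)
    upper_skin = np.array([25, 255, 255], dtype=np.uint8)
    skin_mask = cv2.inRange(hsv, lower_skin, upper_skin)

    # Filtering: median blur removes small noise in the skin mask.
    skin_mask = cv2.medianBlur(skin_mask, 5)

    # Hand segmentation: keep the largest skin-coloured contour as the ROI.
    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    roi = resized[y:y + h, x:x + w]

    # Grayscale conversion of the segmented region of interest.
    return cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)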
“…In India, major research is focused on regional versions or manual components of signs. [6][7][8][9][10][11] reported work on dynamic hand gesture recognition on their own datasets, which contain a limited number of single- and two-hand manual signs. For Indian sign language translation, [12] and [13] proposed algorithms to convert ISL sentences into English text, but they used a traditional rule-based approach for sign translation due to the limited size of their datasets.…”
Section: Introduction
confidence: 99%
“…Hence, this study attains 95.0% accuracy for recognition on the Handicraft-Gesture data set and 89.5% accuracy on the Leap Motion gesture data set. Gupta et al. (2016) presented a methodology for recognizing hand gestures continuously via three-axis accelerometer and gyroscope sensors incorporated in a smart device. The inconsistent influence of the hand performing a gesture is reduced by the proposed gesture-coding algorithm.…”
Section: Introduction
confidence: 99%