IEEE Africon '11 (2011)
DOI: 10.1109/afrcon.2011.6072114
Sign language recognition using the Extreme Learning Machine

Cited by 16 publications (4 citation statements) · References 9 publications
“…Each gesture was mapped onto the corresponding alphabet letter by comparing three different classifiers, namely support vector machines, Mahalanobis distances and Euclidean distances, obtaining an accuracy of >90%. A complete set of 26 characters was classified, with an accuracy of 95%, by means of a glove equipped with fourteen RFSs and a neural network classifier, in [207]. A set of 31 static gestures was recognized, with a classification accuracy ranging between 77.42% and 99.61%, from a South African sign language dataset, adopting the Levenberg-Marquardt training algorithm and a neural network classifier [158].…”
Section: Gesture Recognition Assistance and Interpretation
confidence: 99%
“…As a result, the system had lower accuracy than the trained system. Sole et al. used the Extreme Learning Machine (ELM) algorithm to classify the Auslan (Australian Sign Language) alphabet [8]. They chose the ELM because it is a simplified version of a neural network (NN) [8].…”
Section: Cyber Gloves
confidence: 99%
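The statement above characterizes the ELM as a simplified neural network. The core idea, which also explains the fast training reported in the citing works, is that the hidden-layer weights are random and fixed, so only the output weights are solved in closed form by least squares. A minimal illustrative sketch (not the code from any of the cited papers; the class and variable names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Minimal Extreme Learning Machine: random fixed hidden layer,
    output weights fitted in one shot via the Moore-Penrose pseudoinverse."""

    def __init__(self, n_inputs, n_hidden, n_outputs):
        self.W = rng.normal(size=(n_inputs, n_hidden))  # random input weights (never trained)
        self.b = rng.normal(size=n_hidden)              # random hidden biases (never trained)
        self.beta = np.zeros((n_hidden, n_outputs))     # output weights (the only learned part)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)             # nonlinear hidden activations

    def fit(self, X, Y):
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ Y               # closed-form least-squares solve
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage: two well-separated Gaussian blobs with one-hot targets.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
Y = np.vstack([np.tile([1, 0], (50, 1)), np.tile([0, 1], (50, 1))])
model = ELM(2, 20, 2).fit(X, Y)
acc = np.mean(model.predict(X).argmax(1) == Y.argmax(1))
```

Because training reduces to a single pseudoinverse instead of iterative backpropagation, this is consistent with the "up to 5 times faster" training reported for ELM in the statements below.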
“…The ELM-based artificial neural network was chosen by the present study due to its capacity for real-time pattern recognition in applications [9], as well as the fact that it is used in gesture recognition systems [36], [35]. In [37], the ELM technique was compared to other machine learning techniques for sign language recognition applied to the LIBRAS sign language and demonstrated recognition rates very similar to state-of-the-art results while training up to 5 times faster, as presented in Table 1 of [37].…”
Section: System Architecture Using ELM Neural Network
confidence: 99%