Sign language is a crucial part of the lives of people who cannot speak or hear. People with these disabilities struggle to communicate with the world around them and often feel left behind, and much research is ongoing to improve communication for them. This work establishes interaction between hearing- or speech-impaired people and the world by recognizing 33 hand poses and gestures of Indian Sign Language (ISL). The framework recognizes alphabets and numbers in real time and also generates gestures in real time for given alphabets and numbers. A fine-tuned Convolutional Neural Network (CNN) model is explored for real-time recognition of alphabets and numbers. A GUI provides an easy-to-use interface and immediate visual feedback. Data-acquisition software is also developed to create a database; a database of 74,200 images of the 33 static signs is captured and used in this work. The results are evaluated across different CNN architectures and learning rates, using accuracy, precision, recall, and F-score as performance metrics. The proposed work achieves a highest training accuracy of 99.97% and a validation accuracy of 99.59%.
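The abstract reports accuracy, precision, recall, and F-score as its evaluation metrics. As a minimal illustration of how such metrics are typically computed for a multi-class classifier like the 33-sign ISL model (the function below is a generic sketch with macro averaging, not the authors' evaluation code; the example labels are hypothetical):

```python
import numpy as np

def classification_metrics(y_true, y_pred, num_classes):
    """Accuracy plus macro-averaged precision, recall, and F-score
    for a multi-class classifier (e.g., 33 ISL sign classes)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    accuracy = float(np.mean(y_true == y_pred))
    precisions, recalls = [], []
    for c in range(num_classes):
        tp = np.sum((y_pred == c) & (y_true == c))  # true positives for class c
        fp = np.sum((y_pred == c) & (y_true != c))  # false positives
        fn = np.sum((y_pred != c) & (y_true == c))  # false negatives
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    precision = float(np.mean(precisions))
    recall = float(np.mean(recalls))
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return accuracy, precision, recall, f_score

# Toy 3-class example with made-up labels:
acc, p, r, f = classification_metrics([0, 0, 1, 1, 2, 2],
                                      [0, 1, 1, 1, 2, 0], num_classes=3)
```

Macro averaging weights every sign class equally, which matters when per-class image counts differ; micro averaging is the common alternative when overall per-image accuracy is the priority.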