This article surveys the most widely used sign languages and the relationships between them, and considers the development of a hand-gesture recognition system for a verbal robot applied to the Kazakh language. The proposed system includes a touch sensor that measures the electrical properties of the user's skin on contact, providing more accurate information for simulating and displaying the gestures of the robot hand. Within the system, the speed and accuracy of recognizing each gesture of the verbal robot are calculated: the average recognition accuracy exceeded 98%, and the detection time was 3 ms on a 1.9 GHz Jetson Nano processor, which is sufficient for a robot that displays natural-language gestures. A complete fingerspelling alphabet of Kazakh sign language for the verbal robot is also proposed. A machine learning method was used to improve the quality of gesture recognition. The operability of the developed gesture-recognition technique was tested, and computational experiments evaluated the effectiveness of the algorithms and software with which the verbal robot responds to voice commands, based on automatic recognition of multilingual human speech. The authors thus propose an intelligent verbal complex implemented in Python, with a CMUSphinx communication module, a PyOpenGL simulator for graphical execution of commands, and a robot manipulation module based on 3D modeling from ABB.
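As a hedged illustration of how such a verbal complex might turn recognized speech into fingerspelling gestures, the sketch below maps a recognized word to a sequence of finger poses. The letter subset, the servo-angle presets, and the `word_to_gesture_sequence` helper are hypothetical placeholders for illustration only; the abstract states only that CMUSphinx supplies the recognized command.

```python
# Minimal sketch (not the authors' implementation): mapping recognized text to
# fingerspelling gestures. All servo-angle presets are hypothetical placeholders.
from typing import Dict, List, Tuple

# Each preset: flexion angles (degrees) for thumb, index, middle, ring, little fingers.
FingerPose = Tuple[int, int, int, int, int]

# Hypothetical dactyl presets for a few Kazakh letters (illustrative values only).
DACTYL_PRESETS: Dict[str, FingerPose] = {
    "а": (10, 0, 90, 90, 90),
    "ә": (10, 0, 0, 90, 90),
    "б": (90, 0, 0, 0, 0),
    "с": (45, 45, 45, 45, 45),
}

def word_to_gesture_sequence(word: str) -> List[FingerPose]:
    """Convert a recognized word into a sequence of finger poses, skipping unknown letters."""
    return [DACTYL_PRESETS[ch] for ch in word.lower() if ch in DACTYL_PRESETS]

if __name__ == "__main__":
    # A recognized voice command would come from the speech module (e.g. CMUSphinx);
    # here a literal string stands in for that output.
    print(word_to_gesture_sequence("сәбіз"))
```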
In this work, the forward and inverse kinematics of both robot arms are analyzed on the basis of an analytical and informational representation. The results of the study will be used to enable the robot to gesture in sign language, both Kazakh and other languages, for use in educational systems, especially in children's institutions and societies for deaf people. A simulation model of the movement of the robot's arms in the workspace has been studied and built; the developed model will be further implemented and used as the mathematical and informational support of the robot. The developed library contains implementations of forward kinematics and iterative algorithms for inverse kinematics. The InMoov robot, supported by the MyRobotLab package, is a platform widely used in research. A forward kinematic model of the robot's left and right arms has been studied. Using the Python programming language, the workspace of the robot manipulators was computed with the matplotlib library, and an iterative algorithm was developed to find a feasible path for the manipulators in space. A structured artificial neural network (ANN) model is proposed to solve the inverse kinematics of the InMoov robot arm with six degrees of freedom (4-DOF). The applied ANN model is a multilayer perceptron neural network (MLPNN) trained with a learning rule of the Adam gradient-descent type. To solve this problem, the search for the best ANN configuration was studied, and it was established that the multilayer perceptron network gives the minimum mean squared error. The regression analysis, which shows a 95.6% fit across all of the variables involved, is acceptable for obtaining the inverse kinematics of the InMoov robot.
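As a rough illustration of the forward-kinematics and iterative inverse-kinematics components described above, the sketch below chains standard Denavit-Hartenberg transforms in NumPy and refines the joint angles with a damped least-squares update built on a numerical Jacobian. The DH table, step size, and tolerances are assumptions for a generic 4-DOF arm, not the InMoov parameters or the authors' algorithm.

```python
# Sketch (not the paper's code): forward kinematics from standard DH transforms and an
# iterative inverse-kinematics solver using a numerical Jacobian with damped least squares.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform of one joint in the standard DH convention."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(q, dh):
    """End-effector position (x, y, z) for joint angles q."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, dh):
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]

def inverse_kinematics(target, q0, dh, iters=200, step=0.5, eps=1e-4, damping=1e-3):
    """Iteratively move the joints so the end effector approaches the target position."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - forward_kinematics(q, dh)
        if np.linalg.norm(err) < eps:
            break
        # Numerical Jacobian of the position with respect to each joint angle.
        J = np.zeros((3, len(q)))
        for i in range(len(q)):
            dq = np.zeros(len(q))
            dq[i] = 1e-6
            J[:, i] = (forward_kinematics(q + dq, dh) - forward_kinematics(q, dh)) / 1e-6
        # Damped least-squares update (more stable than a plain pseudo-inverse near singularities).
        q += step * J.T @ np.linalg.solve(J @ J.T + damping * np.eye(3), err)
    return q

# Hypothetical link parameters (d, a, alpha) in metres/radians.
DH = [(0.10, 0.0, np.pi / 2), (0.0, 0.25, 0.0), (0.0, 0.20, 0.0), (0.0, 0.05, 0.0)]
goal = forward_kinematics([0.3, 0.5, -0.4, 0.2], DH)   # reachable target by construction
print(forward_kinematics(inverse_kinematics(goal, [0.0, 0.1, 0.1, 0.1], DH), DH), goal)
```

Similarly, a multilayer perceptron trained with Adam can approximate inverse kinematics from forward-kinematics samples. The sketch below uses scikit-learn's MLPRegressor (whose solver here is Adam) on a simple two-link planar arm; the link lengths, network size, and sampling ranges are illustrative assumptions rather than the configuration reported for the InMoov arm.

```python
# Sketch: learning inverse kinematics with an MLP trained by Adam, using samples
# generated from a known forward model. All parameters are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

LINK1, LINK2 = 0.25, 0.20   # hypothetical link lengths in metres

def fk_planar(q):
    """Forward kinematics of a 2-link planar arm: joint angles (N, 2) -> positions (N, 2)."""
    x = LINK1 * np.cos(q[:, 0]) + LINK2 * np.cos(q[:, 0] + q[:, 1])
    y = LINK1 * np.sin(q[:, 0]) + LINK2 * np.sin(q[:, 0] + q[:, 1])
    return np.column_stack([x, y])

rng = np.random.default_rng(0)
q_train = rng.uniform([0.0, 0.0], [np.pi, np.pi], size=(10000, 2))   # one elbow configuration only
X_train = fk_planar(q_train)                                          # positions are the network inputs

mlp = MLPRegressor(hidden_layer_sizes=(64, 64), solver="adam",
                   max_iter=1000, random_state=0)
mlp.fit(X_train, q_train)                                             # learn position -> joint angles

# Evaluate by pushing the predicted angles back through the forward model.
q_test = rng.uniform([0.0, 0.0], [np.pi, np.pi], size=(500, 2))
X_test = fk_planar(q_test)
pos_err = np.linalg.norm(fk_planar(mlp.predict(X_test)) - X_test, axis=1)
print("mean position error [m]:", pos_err.mean(), "R^2:", mlp.score(X_test, q_test))
```

Restricting the second joint to [0, π] keeps the position-to-angles mapping single-valued (one elbow configuration), which is what allows a plain regressor to learn it.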