“…Within the framework of this research, the results of such studies are of interest from the point of view of information visualization, particularly the accuracy of static and dynamic gestures, the position of the human hand, the features of a person's lip contour, etc. If we consider the use of gestures as a way of interacting with a computer system, the algorithms for converting and interpreting gesture information, and the corresponding software implementations, the existing software tools rely on a large number of sensors, joysticks, trackballs, or touch screens, as well as complex control systems (Karpov, 2013; Myasoedova et al., 2020; Rahman et al., 2019; Ryumin et al., 2020). As Karpov notes in his works, "the currently widely used graphical and textual interfaces are focused on experienced users, and the available research practically does not touch upon the issues of human-machine communication for persons with disabilities" (Karpov, 2013).…”