Communication difficulties are common among people with severe motor disabilities, hindering their interaction with family, caregivers, and society in general. Augmentative and Alternative Communication (AAC) aims to compensate for this communication deficit and provide these individuals with a better quality of life. However, individuals with severe neuromotor disorders and severe movement restrictions face considerable challenges in using many current assistive technologies. In this context, this article presents an Alternative Communication System based on Artificial Neural Networks, designed with a user-centered approach focused on the needs of this population. Signal input and processing are performed by reading facial landmarks with the MediaPipe FaceMesh library, and the gesture/facial-expression classifier is implemented as a Recurrent Neural Network model composed of Long Short-Term Memory (LSTM) units and dense layers. Real-time experimental results indicate that the proposed system performs well, achieving an average accuracy of 91.8% in recognizing the registered gestures.
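As an illustration of the pipeline described above, the sketch below shows how per-frame facial landmarks could be extracted with MediaPipe FaceMesh and fed to an LSTM-plus-dense classifier built with Keras. The sequence length, layer sizes, and number of gesture classes are illustrative assumptions, not the configuration reported in the article, and the model would first need to be trained on labeled landmark sequences.

```python
import cv2
import numpy as np
import mediapipe as mp
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

SEQ_LEN = 30          # frames per gesture sequence (assumed value)
NUM_LANDMARKS = 468   # landmarks returned by FaceMesh without iris refinement
NUM_CLASSES = 5       # number of registered gestures/expressions (assumed value)

# LSTM + dense classifier over sequences of flattened (x, y, z) landmark vectors.
model = Sequential([
    Input(shape=(SEQ_LEN, NUM_LANDMARKS * 3)),
    LSTM(64, return_sequences=True),
    LSTM(32),
    Dense(32, activation="relu"),
    Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
# In practice the model is trained on recorded, labeled landmark sequences before use.

def landmarks_from_frame(face_mesh, frame_bgr):
    """Return a flat (NUM_LANDMARKS * 3,) vector of facial landmarks, or None."""
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    lms = results.multi_face_landmarks[0].landmark
    return np.array([[lm.x, lm.y, lm.z] for lm in lms], dtype=np.float32).flatten()

# Real-time loop: buffer SEQ_LEN frames of landmarks, then classify the sequence.
cap = cv2.VideoCapture(0)
buffer = []
with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        vec = landmarks_from_frame(face_mesh, frame)
        if vec is not None:
            buffer.append(vec)
        if len(buffer) == SEQ_LEN:
            probs = model.predict(np.expand_dims(buffer, axis=0), verbose=0)[0]
            print("predicted gesture class:", int(np.argmax(probs)))
            buffer = []
cap.release()
```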