Sign language is a visual and gestural language used by deaf and hard-of-hearing people to communicate. However, communication between hearing and non-hearing individuals can be challenging due to the language barrier. To overcome this barrier, a mobile-based translator is proposed that uses a Convolutional Neural Network (CNN) to recognize and translate hand gestures in real time. The proposed system consists of a CNN model trained on a large dataset of hand gestures covering various signs, such as the alphabet, numbers, and common phrases, and a backend server that handles the translation of the recognized signs into text. The system was evaluated with several CNN architectures, namely ResNet, MobileNet, and VGG-16, of which the last gave the best accuracy at 97.69%. The trained VGG-16 model recognizes signs by extracting features from images of the hand gestures and using these features to classify them. Once a gesture is recognized, the backend server translates it into text using a pre-defined mapping of signs to words or phrases. The translated text is then displayed to the user in real time on the mobile app, enabling seamless communication between hearing and non-hearing individuals. The system was implemented as a mobile app using Flutter, a cross-platform development framework. This app makes communication easier for vulnerable users and enables information sharing without discrimination.
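The sketch below illustrates, under stated assumptions, the kind of pipeline the abstract describes: a VGG-16 convolutional base reused as a feature extractor, a small classification head for the gesture classes, and a lookup that stands in for the backend's pre-defined sign-to-text mapping. TensorFlow/Keras, the class count, layer sizes, and the `SIGN_TO_TEXT` dictionary are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a VGG-16 transfer-learning classifier for hand gestures,
# assuming TensorFlow/Keras; class count and mapping are hypothetical.
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

NUM_CLASSES = 36  # hypothetical: 26 letters + 10 digits

# Load VGG-16 pretrained on ImageNet, dropping its classifier head so the
# convolutional base serves purely as a feature extractor for gesture images.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze pretrained weights; train only the new head

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one output per sign
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical stand-in for the backend server's pre-defined mapping of
# recognized signs to words or phrases.
SIGN_TO_TEXT = {0: "A", 1: "B", 2: "C"}  # extended to all classes in practice

def translate(image_batch):
    """Classify a batch of gesture images and map each prediction to text."""
    probs = model.predict(image_batch)
    return [SIGN_TO_TEXT.get(int(p.argmax()), "?") for p in probs]
```

Freezing the convolutional base and training only the dense head is one common way such a system could reach high accuracy on a modest gesture dataset; on-device inference from the Flutter app would typically call a server endpoint or a converted (e.g., TensorFlow Lite) version of the same model.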