Human facial expressions convey a great deal of information, often more effectively than words. Recognizing facial expressions is therefore essential for human-machine interaction. Applications of automatic facial expression recognition (FER) include, but are not limited to, understanding human behaviour, detecting mental illness, and synthesizing artificial human emotions. Human-computer interaction, autonomous cars, and a wide range of multimedia applications all rely heavily on facial expression detection.

In this study, we present a reusable architecture for recognizing human emotions from facial expressions. The framework comprises two machine learning models that can be trained in advance for use in real-time applications. First, AdaBoost cascade classifiers are used to detect faces in the input images. Neighborhood difference features (NDF) are then extracted to represent each face in terms of its localized appearance. Rather than relying on intensity values alone, NDF encodes multiple patterns based on the relationships between neighboring regions. Despite high recognition rates in controlled settings, facial expression recognition by computers remains difficult. Geometry-based and appearance-based methods are the two approaches to automated FER most widely used in the literature. Facial expression recognition typically consists of four steps: pre-processing, face detection, feature extraction, and expression classification. In this work, deep learning techniques (convolutional neural networks) were used to recognize the seven primary human emotions: anger, disgust, fear, happiness, sadness, surprise, and neutrality.
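
To make the face-detection stage concrete, the following is a minimal sketch using OpenCV's Haar cascade, which implements a Viola-Jones-style AdaBoost cascade of weak classifiers. The image path and parameter values are illustrative defaults, not the settings used in this study.

```python
import cv2

# Pre-trained Haar cascade shipped with OpenCV: an AdaBoost cascade of
# weak classifiers over Haar-like features (Viola-Jones detector).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_bgr):
    """Return bounding boxes (x, y, w, h) for detected faces."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # scaleFactor and minNeighbors are typical values, not tuned settings.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

image = cv2.imread("photo.jpg")  # illustrative input path
for (x, y, w, h) in detect_faces(image):
    face_crop = image[y:y + h, x:x + w]  # crop passed to the later FER stages
```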
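
The exact NDF formulation is not given here, so the sketch below is only illustrative: it assumes NDF compares each grid cell's mean intensity with those of its eight neighbors and histograms the resulting binary codes, capturing relationships between neighboring regions rather than raw intensities. The grid size and encoding scheme are assumptions.

```python
import numpy as np

def neighborhood_difference_features(gray, grid=(8, 8)):
    """Illustrative NDF: per-cell binary codes from the sign of differences
    between each cell's mean intensity and its 8 neighbors' means.
    (Assumed formulation; the actual NDF descriptor may differ.)"""
    h, w = gray.shape
    gh, gw = grid
    ch, cw = h // gh, w // gw
    # Mean intensity of each grid cell.
    means = gray[:gh * ch, :gw * cw].reshape(gh, ch, gw, cw).mean(axis=(1, 3))
    codes = []
    for i in range(1, gh - 1):
        for j in range(1, gw - 1):
            centre = means[i, j]
            neighbors = [means[i + di, j + dj]
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di, dj) != (0, 0)]
            # 8-bit code: one bit per neighbor-vs-centre comparison.
            code = sum(int(n >= centre) << k for k, n in enumerate(neighbors))
            codes.append(code)
    # A histogram of codes serves as the feature vector for the face.
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / max(hist.sum(), 1)
```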
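
Finally, a minimal sketch of the CNN classification stage, assuming 48x48 grayscale face crops (the FER2013 convention) and seven output classes. The layer sizes and hyperparameters below are illustrative, not the exact network used in this work.

```python
import tensorflow as tf

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutrality"]

# Illustrative CNN; the architecture is an assumption, not the paper's network.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu",
                           input_shape=(48, 48, 1)),  # grayscale face crop
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) would train on labeled face crops; at inference, the argmax
# over the softmax output gives the predicted emotion label.
```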