Abstract-Facial emotion analysis is one of the fundamental techniques that can be exploited in natural human-computer interaction and is therefore one of the most studied topics in the current computer vision literature. Consequently, face feature extraction is an indispensable element of facial emotion analysis, as it influences decision-making performance. The paper concentrates on the classification of emotions based on mouth features; the mouth, next to the eye region, is one of the most representative face regions in the context of emotion retrieval. Additionally, an original, gradient-based mouth feature extraction method is presented. The method was evaluated on a subset of the Yale face images database, and the classification accuracy for a single emotion exceeds 70%.
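The abstract describes the extraction method only as gradient based. The sketch below is a minimal illustration of that general idea, not the paper's actual descriptor: it assumes a grayscale mouth crop, computes Sobel gradients, and summarizes them as a magnitude-weighted orientation histogram. The function name, histogram binning, and normalization are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage


def mouth_gradient_features(mouth_roi, bins=16):
    """Illustrative gradient-based feature vector for a grayscale mouth crop.

    `mouth_roi` is assumed to be a 2-D numpy array; the orientation-histogram
    descriptor is a placeholder, not the paper's exact method.
    """
    roi = mouth_roi.astype(np.float64)
    gx = ndimage.sobel(roi, axis=1)   # horizontal intensity gradient
    gy = ndimage.sobel(roi, axis=0)   # vertical intensity gradient
    magnitude = np.hypot(gx, gy)      # per-pixel gradient strength
    orientation = np.arctan2(gy, gx)  # per-pixel gradient direction

    # Magnitude-weighted histogram of gradient orientations over the ROI.
    hist, _ = np.histogram(orientation, bins=bins,
                           range=(-np.pi, np.pi), weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist
```

A feature vector of this kind could then be passed to any standard classifier (e.g., a nearest-neighbour or SVM model) to assign an emotion label per image, which is the kind of evaluation the abstract reports on the Yale subset.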