Emotions play a crucial role in person-to-person interaction. In recent years, there has been growing interest in improving all aspects of interaction between humans and computers. The ability to understand human emotions, especially by observing facial expressions, is desirable for computers in several applications. This paper explores ways of human-computer interaction that enable the computer to be more aware of the user's emotional expressions. We present an approach to emotion recognition from facial expression and from hand and body posture. Our multimodal emotion recognition system uses two separate models, one for facial expression recognition and one for hand and body posture recognition, and then combines the results of both classifiers using a third classifier that outputs the resulting emotion. The multimodal system gives more accurate results than a unimodal or bimodal system.
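The fusion scheme described above can be sketched in a few lines: each unimodal classifier emits a probability distribution over emotions, and a third stage combines the two distributions into a final decision. This is a minimal illustrative sketch only; the classifier functions, emotion labels, and the weighted-average combiner are assumptions (the paper trains a third classifier rather than using a fixed weighting), but the structure — two unimodal outputs feeding one fusion stage — is the same.

```python
# Minimal sketch of decision-level (late) fusion for multimodal emotion
# recognition. All models here are hypothetical stand-ins, not the
# paper's actual classifiers.

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def face_classifier(face_features):
    # Stand-in: in a real system this would be a trained facial-expression
    # model; here the "features" are already a probability vector.
    return face_features

def posture_classifier(posture_features):
    # Stand-in for the hand/body-posture model, same convention as above.
    return posture_features

def fusion_classifier(face_probs, posture_probs, face_weight=0.6):
    # Combine the two unimodal probability vectors. A weighted average is
    # used here for simplicity; a trained third classifier would take the
    # same inputs and produce the same kind of output.
    combined = [face_weight * f + (1 - face_weight) * p
                for f, p in zip(face_probs, posture_probs)]
    return EMOTIONS[combined.index(max(combined))]

if __name__ == "__main__":
    face_probs = face_classifier([0.1, 0.7, 0.1, 0.1])
    posture_probs = posture_classifier([0.2, 0.5, 0.2, 0.1])
    print(fusion_classifier(face_probs, posture_probs))  # happiness
```

The key design point is that fusion happens at the decision level: each modality is classified independently, so one model can be retrained or replaced without touching the other.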
Gestures are a major form of human communication. Hence gestures are an appealing way to interact with computers, as they are already a natural part of how we communicate. A primary goal of gesture recognition research is to create a system that can identify specific human gestures and use them to convey information or for device control; by implementing real-time gesture recognition, a user can control a computer by performing a specific gesture in front of a video camera linked to the computer. This project covers various issues: what gestures are, their classification, their role in implementing a gesture recognition system, system architecture concepts for implementing a gesture recognition system, major issues involved in implementing a simplified gesture recognition system, exploitation of gestures in experimental systems, the importance of gesture recognition systems, real-time applications, and the future scope of gesture recognition systems. The algorithms used in this project are a finger-counting algorithm and X-Y axis analysis (to recognize the thumb).
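The finger-counting idea mentioned above can be illustrated with a toy version: scan a horizontal line through a binarized hand silhouette and count the contiguous "on" runs, each taken to be one finger crossing the line. This is a simplified sketch under stated assumptions — the real system works on live camera frames and the abstract does not specify its exact counting method; here a plain list of 0/1 pixels stands in for one row of a thresholded image.

```python
# Toy finger counting on one row of a binarized hand image.
# Each contiguous run of 1s is counted as one finger cross-section.

def count_fingers(scanline):
    """Count contiguous runs of 1s in a binary scanline."""
    fingers = 0
    previous = 0
    for pixel in scanline:
        if pixel == 1 and previous == 0:
            # Rising edge: a new finger begins at this pixel.
            fingers += 1
        previous = pixel
    return fingers

if __name__ == "__main__":
    # A row crossing three separated fingers.
    row = [0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0]
    print(count_fingers(row))  # 3
```

In practice the scanline would be placed relative to the detected palm, and a separate X-Y axis check (as the project does for the thumb) would handle digits that do not cross the chosen line.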
Non-verbal communication may be used to enhance verbal communication or even provide developers with an alternative means of communicating information. Emotion and gesture recognition have been highlighted in the areas of artificial intelligence and advanced machine learning, and are important features for intelligent human-computer interaction. This paper is a literature survey that reviews the research work already carried out in this area. Facial expression has been found to be the most important component, and among facial features, the eyes and mouth are probably the most prominent. Neural networks are the most widely used approach. Approaches based on rough-fuzzy definitions can probably resolve the complexity, and context-based recognition can be added to resolve the ambiguity involved in different scenarios.