Automatic recognition of spontaneous facial expressions is a major challenge in affective computing. Head rotation, face pose, illumination variation, and occlusion are among the factors that increase the complexity of recognizing spontaneous expressions in practical applications. Effective recognition also depends significantly on the quality of the database used. Most well-known facial expression databases consist of posed expressions, yet there is a strong demand for spontaneous expression databases for the practical deployment of facial expression recognition algorithms. In this paper, we propose and establish a new facial expression database containing spontaneous expressions of male and female participants of Indian origin. The database consists of 428 segmented video clips of the spontaneous facial expressions of 50 participants. In our experiment, emotions were induced in the participants using emotional videos, and self-ratings were collected simultaneously for each experienced emotion. The facial expression clips were annotated carefully by four trained decoders, and the annotations were further validated against the nature of the stimuli used and the participants' self-reports of emotion. An extensive analysis was carried out on the database using several machine learning algorithms, and the results are provided for future reference. Such a spontaneous database will help in the development and validation of algorithms for the recognition of spontaneous expressions.
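As a minimal illustration of the kind of baseline analysis the abstract refers to (benchmarking several standard classifiers on features extracted from the expression clips), the sketch below uses scikit-learn with a placeholder feature matrix; the feature dimensionality, label set, and classifier choices are assumptions, not the authors' protocol.

```python
# Hedged sketch: cross-validated comparison of common classifiers on
# per-clip feature vectors. The data below is synthetic placeholder data,
# standing in for features extracted from the 428 expression clips.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(428, 64))    # placeholder: one 64-d feature vector per clip
y = rng.integers(0, 6, size=428)  # placeholder: six emotion labels (assumed)

for name, clf in [("SVM", SVC()),
                  ("Random Forest", RandomForestClassifier()),
                  ("k-NN", KNeighborsClassifier())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```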
This paper proposes a scheme for assessing the alertness level of an individual by simultaneously acquiring multimodal physiological signals and fusing the information into a single metric that quantifies alertness. The system takes electroencephalogram (EEG) data, high-speed image sequences, and speech as inputs. Parameters indicative of alertness are computed from each of these measures, and a metric is proposed that fuses the parameters to indicate an individual's alertness level at a given instant. The scheme has been validated experimentally using standard neuropsychological tests: the Visual Response Test (VRT), the Auditory Response Test (ART), a Letter Counting (LC) task, and the Stroop Test. The tests serve both as cognitive tasks to induce mental fatigue and as tools to gauge the subject's present degree of alertness. The correlation between the measures has been studied, and the experimental variables have been statistically analyzed using multivariate linear regression and analysis of variance. The correspondence between the trends obtained from the biomarkers and the neuropsychological measures validates the usability of the proposed metric.
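The abstract does not give the fusion formula, so the following is only an illustrative sketch of one simple way such a fused metric could be formed: z-scoring each per-modality indicator against its baseline and taking a weighted sum. The indicator names, weights, and numbers are hypothetical.

```python
# Illustrative sketch only (not the paper's formula): fuse per-modality
# alertness indicators (EEG, video, speech) into one scalar score.
import numpy as np

def alertness_score(params, baseline_mean, baseline_std, weights):
    """params, baseline_mean, baseline_std, weights: 1-D arrays of equal length."""
    z = (np.asarray(params) - baseline_mean) / baseline_std  # normalise each indicator
    return float(np.dot(weights, z))                         # weighted fusion into one metric

# Example with three hypothetical indicators (EEG band-power ratio,
# eye-blink rate from video, speech pause ratio) at one time instant.
score = alertness_score(
    params=[1.8, 14.0, 0.22],
    baseline_mean=np.array([1.5, 12.0, 0.18]),
    baseline_std=np.array([0.3, 3.0, 0.05]),
    weights=np.array([0.4, 0.3, 0.3]),
)
print(f"alertness metric: {score:.2f}")
```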
The inability to deal efficiently with emotionally laden situations often leads to poor interpersonal interactions, which adversely affects an individual's psychological functioning. Higher trait emotional intelligence (EI) is associated not only with psychological wellbeing, educational attainment, and job-related success, but also with a willingness to seek professional and non-professional help for personal-emotional problems, depression, and suicidal ideation. It is therefore important to identify low-EI individuals, who are more prone to mental health problems than their high-EI counterparts, and to give them appropriate EI training, which can help prevent the onset of various mood-related disorders. Since people may be unaware of their level of EI or emotional skills, or may fake responses in self-report questionnaires in high-stakes situations, a system that assesses EI using physiological measures can prove effective. We present a multimodal method for detecting the level of trait emotional intelligence using non-contact autonomic sensors. To our knowledge, this is the first work to predict emotional intelligence level from physiological/autonomic (cardiac and respiratory) response patterns to emotions. The trait EI of 50 users was measured using the Schutte Self Report Emotional Intelligence Test (SSEIT), and their cardiovascular and respiratory data were recorded using an FMCW radar sensor both at baseline and while viewing affective movie clips. We first examine the relationships between users' trait EI scores and their autonomic response and reactivity to the clips; the analysis suggests a significant relationship between EI and autonomic response and reactivity. We then attempt binary EI level detection using a linear SVM, and also attempt to classify each sub-factor of EI, namely perception of emotion, managing own emotions, managing others' emotions, and utilization of emotions. The proposed method achieves an EI classification accuracy of 84%, while accuracies ranging from 58% to 76% are achieved for recognition of the sub-factors. This is a first step towards identifying the EI of an individual purely from physiological responses. Limitations and future directions are discussed.
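Since the abstract names a linear SVM for binary EI detection, the sketch below shows one plausible setup under stated assumptions: per-participant autonomic features (baseline and reactivity statistics) with a median split on SSEIT scores as the label. The feature layout, label rule, and data are placeholders, not the authors' pipeline.

```python
# Minimal sketch (assumptions noted above): binary trait-EI classification
# with a linear SVM on per-participant cardiac/respiratory features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))     # placeholder: 50 participants x 8 autonomic features
y = rng.integers(0, 2, size=50)  # placeholder: 1 = high EI, 0 = low EI (assumed median split)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```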