As humans, we experience social stress in countless everyday situations. Giving a speech in front of an audience, going through a job interview, and similar experiences all induce stress states that affect us both psychologically and physiologically. Studying the link between stress and physiological responses has therefore become an important societal issue, and research in this field has recently grown in popularity. However, publicly available datasets remain limited. In this article, we propose a new dataset, UBFC-Phys, collected with and without contact from participants experiencing social stress situations. A wristband was used to measure contact blood volume pulse (BVP) and electrodermal activity (EDA) signals. Video recordings allowed us to compute remote pulse signals, using remote photoplethysmography (RPPG), as well as facial expression features. Pulse rate variability (PRV) was extracted from the BVP and RPPG signals. Our dataset makes it possible to evaluate video-based physiological measures against more conventional contact-based modalities. The goal of this article is to present both the dataset, which we make publicly available, and experimental results on the comparison of contact and non-contact data, as well as on stress recognition. We obtained a stress state recognition accuracy of 85.48%, achieved by remote PRV features.
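To make the PRV extraction step concrete, the sketch below shows one common way to derive time-domain PRV features from a contact BVP or remote pulse signal: detect systolic peaks, compute inter-beat intervals, and summarize their variability. The function name, sampling rate, and feature set are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

def prv_features(pulse_signal, fs):
    """Extract simple time-domain PRV features from a BVP or rPPG
    signal sampled at fs Hz. Hypothetical helper for illustration."""
    # Detect systolic peaks; enforce a refractory period of ~0.4 s,
    # i.e. a maximum plausible pulse rate of ~150 bpm.
    peaks, _ = find_peaks(pulse_signal, distance=int(0.4 * fs))
    # Inter-beat intervals (IBIs) in milliseconds.
    ibi = np.diff(peaks) / fs * 1000.0
    return {
        "mean_ibi": ibi.mean(),                        # average beat-to-beat interval
        "sdnn": ibi.std(ddof=1),                       # overall variability
        "rmssd": np.sqrt(np.mean(np.diff(ibi) ** 2)),  # short-term variability
    }

# Example: 60 s of synthetic pulse at 64 Hz (a typical wristband rate).
fs = 64
t = np.arange(0, 60, 1 / fs)
bvp = np.sin(2 * np.pi * 1.2 * t)  # ~72 bpm
print(prv_features(bvp, fs))
```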
Machine learning has grown tremendously in recent years, and thanks to this progress, some computer vision algorithms can now access information that is difficult or even impossible for the human eye to perceive. It is therefore natural that scientists began looking for ways to probe human emotions and the psyche with this technology. In this paper, we study the feasibility of recognizing and classifying emotional states from videos of people facing a regular RGB camera. We do so by using the barely perceptible facial micro-expressions that humans cannot control, as well as the spontaneous variations of the pulse rate, which we estimate using remote photoplethysmography. We compare these two modalities, and our experimental results show that it is possible to classify emotional states from this implicit information gathered by regular cameras, with encouraging performance.
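As an illustration of how a pulse signal can be recovered from a regular RGB camera, the sketch below implements the simplest rPPG baseline: spatially averaging the green channel over a skin region and band-pass filtering to the plausible heart-rate band. The paper does not specify this particular algorithm; the function and its parameters are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rppg_green(frames, fs, band=(0.7, 4.0)):
    """Estimate a pulse signal from skin pixels via the mean green
    channel, the simplest rPPG baseline (illustrative, not the
    paper's exact method).

    frames: array of shape (T, H, W, 3), RGB, skin region pre-cropped.
    fs: video frame rate in Hz.
    """
    green = frames[:, :, :, 1].mean(axis=(1, 2))  # spatial average per frame
    green = green - green.mean()                  # remove the DC component
    # Band-pass to the plausible heart-rate band (42-240 bpm).
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, green)

# Example with synthetic frames at 30 fps: a faint ~72 bpm modulation.
fs, T = 30, 300
t = np.arange(T) / fs
frames = np.random.rand(T, 32, 32, 3) * 5
frames[:, :, :, 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
pulse = rppg_green(frames, fs)
```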
Remote photoplethysmography (rPPG) quantifies blood volume variations in skin tissue from a video recording captured with a regular RGB camera. The resulting pulse signals often contain noisy portions caused by motion, leading researchers to discard a large number of rPPG signals in their studies. In this paper, we propose an approach based on a Gated Recurrent Unit (GRU) neural network to identify reliable portions of rPPG signals, by classifying rPPG signal samples as reliable or unreliable. For this purpose, rPPG and electrocardiography (ECG) signals were collected from 11 participants; the rPPG signal samples were labeled using ECG as ground truth, and the data were augmented to reach a total of 11,000 one-minute rPPG signals. We developed a model composed of a one-dimensional CNN and a bidirectional GRU (1D-CNN+B-GRU) for this study and obtained an accuracy of 85.88%.
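Below is a minimal Keras sketch of a model in the spirit of the 1D-CNN+B-GRU described above: convolutions extract local waveform patterns and a bidirectional GRU adds temporal context, with a per-time-step sigmoid marking each sample as reliable or unreliable. The layer widths, kernel sizes, and training settings are assumptions, not the published architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_1dcnn_bgru(seq_len, n_features=1):
    """Sketch of a 1D-CNN + bidirectional GRU sample classifier;
    hyperparameters are illustrative assumptions."""
    inputs = tf.keras.Input(shape=(seq_len, n_features))
    # 1D convolutions capture local waveform shape around each sample.
    x = layers.Conv1D(32, 7, padding="same", activation="relu")(inputs)
    x = layers.Conv1D(64, 5, padding="same", activation="relu")(x)
    # The bidirectional GRU models temporal context in both directions.
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
    # Per-time-step sigmoid: each sample labeled reliable (1) or not (0).
    outputs = layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: 1-minute rPPG signals at an assumed 30 Hz sampling rate.
model = build_1dcnn_bgru(seq_len=1800)
model.summary()
```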