Humans use a complex code of non-verbal behavior to communicate their internal states to others. Conversely, understanding the intentions and emotions of others is a fundamental aspect of human social interaction. In the study presented here, we investigate how people perceive emotional states from the observation of different styles of locomotion. Our goal is to find a small set of canonical parameters that allow us to control a wide range of emotional expressions. We generated different classes of walking behavior by varying the head/torso inclination, the walking speed, and the viewing angle of an animation of a virtual character. Eighteen subjects rated the observed walking person using the two-dimensional circumplex model of arousal and valence. The results show that, independent of the viewing angle, participants perceived distinct states of arousal and valence. Moreover, we could show that parametrized body posture encodes emotional states irrespective of contextual influences or facial expressions. These findings suggest that human locomotion transmits basic emotional cues that can be directly related to canonical parameters along different dimensions of expressive behavior. The findings are important because they allow us to build virtual characters whose emotional expression is recognizable at a large distance and over extended periods of time.
Summary
This text describes the data from an initial set of navigation experiments within the scope of the Bio-ICT European project NEUROCHEM. The acquisition system was composed of two segments: a robotic platform developed in SPECS at UPF, and an embedded computer running a custom GNU/Linux distribution developed within the project by UPC (Fig. 1). The embedded computer held a metal oxide gas sensor array (TGS262010, TGS260010 and TGS2810 varieties) with a total autonomy of 1.5 hours. The system was placed in a wind tunnel facility at UPF in order to characterize the response of the metal oxide sensor array in the presence of one odour source together with a strong background. The compounds used were ethanol (as background), acetone and ammonia, at 5%, 11% and 20% dilution in water, respectively. These compounds were diffused into the wind tunnel with the help of an ultrasound diffuser at two separate locations (Fig. 2). Four series of measurements were performed, aiming to explore the capability of the sensor array to construct an odour map (source 1) in the presence of a strong background (source 2) under a controlled environment. Data pre-processing included correction of a time delay in the sensor response with respect to the robot position in the tunnel, and de-noising through a low-pass filter. A separation method based on Independent Component Analysis (ICA) was applied to the sensor data in order to decorrelate the signals from the two sources. ICA assumes a mixing model x = As (1), where the sources s = [s1, s2, …, sm]' are mutually independent random variables and A is an unknown invertible mixing matrix. The algorithm finds a matrix W such that the output y = Wx (2) is a good estimate of the sources s. The Pearson variant of ICA finds y through a mutual information minimization process [1].
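The mixing and unmixing steps above can be sketched numerically. The poster applies the Pearson variant of ICA; the following minimal NumPy sketch uses a standard symmetric FastICA update with a tanh nonlinearity as a generic stand-in, and the two synthetic sources, the mixing matrix, and the function name `fastica` are illustrative assumptions, not the project's code.

```python
import numpy as np

def fastica(x, n_iter=200):
    """Symmetric FastICA with a tanh nonlinearity (stand-in for Pearson-ICA)."""
    # Center and whiten the observations x (rows = samples, columns = sensors).
    x = x - x.mean(axis=0)
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (E @ np.diag(d ** -0.5) @ E.T)
    W = np.eye(z.shape[1])
    for _ in range(n_iter):
        # Fixed-point update: W+ = E[g(Wz)z'] - diag(E[g'(Wz)]) W
        g = np.tanh(z @ W.T)
        W_new = (g.T @ z) / len(z) - np.diag((1 - g ** 2).mean(axis=0)) @ W
        # Symmetric orthogonalization keeps the estimated components decorrelated.
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt
    return z @ W.T  # y = Wx, an estimate of the sources up to scale and order

# Two independent toy sources standing in for the target odour and the
# ethanol background, mixed by an unknown invertible matrix A: x = As.
t = np.linspace(0, 10, 2000)
s = np.column_stack([np.sin(3.0 * t), np.sign(np.sin(7.0 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = s @ A.T

y = fastica(x)
# Each recovered component should correlate strongly with one true source
# (sign and ordering of the components are arbitrary, hence the abs).
corr = np.abs(np.corrcoef(y.T, s.T))[:2, 2:]
```

The absolute cross-correlation matrix `corr` makes the scale/permutation ambiguity of ICA explicit: a successful separation shows one entry near 1 in each row, regardless of which output channel recovered which source.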
Pearson ICA is able to decorrelate the two odour sources, as seen in Figures 3 and 4 for ammonia and ethanol (split into the first and second ICA components) and in Figures 5 and 6 for acetone and ethanol. Ethanol can be considered a very strong background, as metal oxide sensors are highly sensitive to this compound. This poster shows that pre-processing based on Independent Component Analysis is able to discriminate two odour sources. Further work will include automatic determination of the number of components present in the tunnel and the application of the NEUROCHEM platform in surge-and-cast behavioural models.

References
[1] Karvanen, J. and Koivunen, V., "Blind separation methods based on Pearson system and its extensions," Signal Processing 82 (2002) 663–673.