A method for extracting information about facial expressions from images is presented. Facial expression images are coded using a multi-orientation, multi-resolution set of Gabor filters which are topographically ordered and aligned approximately with the face. The similarity space derived from this representation is compared with one derived from semantic ratings of the images by human observers. The results show that it is possible to construct a facial expression classifier with Gabor coding of the facial images as the input stage. The Gabor representation shows a significant degree of psychological plausibility, a design feature which may be important for human-computer interfaces.
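For illustration only, the sketch below shows one common way to compute a multi-orientation, multi-resolution Gabor code for a roughly aligned grayscale face image. It is not the paper's implementation; the scales, orientations, and bandwidth constant are assumed, illustrative values.

```python
# Minimal sketch of a Gabor filter bank, assuming a cropped, roughly aligned
# grayscale face image. Parameter choices below are illustrative assumptions,
# not those of the cited paper.
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(wavelength, theta, sigma, size=31):
    """Complex Gabor kernel with the given wavelength (pixels) and orientation (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + y_t**2) / (2.0 * sigma**2))
    carrier = np.exp(1j * 2.0 * np.pi * x_t / wavelength)
    return envelope * carrier

def gabor_code(image, wavelengths=(4, 8, 16), n_orientations=8):
    """One magnitude map per (scale, orientation) pair, stacked into an array."""
    responses = []
    for lam in wavelengths:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(lam, theta, sigma=0.56 * lam)
            responses.append(np.abs(fftconvolve(image, kern, mode="same")))
    return np.stack(responses)  # shape: (scales * orientations, H, W)

# Example: code a placeholder "face" image and flatten it to a feature vector.
face = np.random.rand(64, 64)
features = gabor_code(face).reshape(-1)
```

A feature vector of this kind could then serve as the input stage to an expression classifier, as the abstract describes.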
Alexithymia is a personality trait characterized by a reduced ability to identify and describe one's own feelings, and it is known to contribute to a variety of physical and behavioural disorders. To elucidate the pathogenesis of stress-related disorders and the normal functions of emotion, it is important to investigate the neurobiology of alexithymia. Although several neurological models of alexithymia have been proposed, there is very little direct evidence for its neural correlates. Using PET, we studied brain activity in subjects with alexithymia while they viewed a range of emotional facial expressions. Twelve alexithymic and 12 non-alexithymic volunteers (all right-handed males) were selected from 247 applicants on the basis of the 20-item Toronto Alexithymia Scale (TAS-20). Regional cerebral blood flow (rCBF) was measured with H₂¹⁵O-PET while the subjects looked at angry, sad and happy faces of varying emotional intensity, as well as neutral faces. Brain responses in the subjects with alexithymia differed significantly from those in the subjects without alexithymia. The alexithymics exhibited lower rCBF than the non-alexithymics in the inferior and middle frontal cortex, orbitofrontal cortex, inferior parietal cortex and occipital cortex of the right hemisphere, and higher rCBF in the superior frontal cortex, inferior parietal cortex and cerebellum of the left hemisphere. A covariance analysis revealed that, when subjects viewed angry and sad facial expressions, rCBF in the inferior and superior frontal cortex, orbitofrontal cortex and parietal cortex of the right hemisphere correlated negatively with individual TAS-20 scores, and that no region's rCBF correlated positively with TAS-20 scores. Moreover, the anterior cingulate cortex and insula were less activated in the alexithymics when they viewed angry faces than when they viewed neutral faces. These results suggest that people with alexithymia process facial expressions differently from people without alexithymia, and that this difference may account for the disordered affect regulation and consequent peculiar behaviour seen in people with alexithymia.
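As a purely illustrative aside, the sketch below shows a region-wise correlation between rCBF values and TAS-20 scores, the kind of relationship the covariance analysis tests. It is not the study's PET analysis pipeline, and the data are random placeholders rather than study data.

```python
# Minimal sketch, not the authors' SPM-style analysis: correlating mean rCBF in
# one region of interest with individual TAS-20 scores across subjects.
# All values below are random placeholders, not data from the study.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
tas20 = rng.uniform(30, 80, size=24)        # one TAS-20 score per subject (hypothetical)
roi_rcbf = rng.normal(50.0, 5.0, size=24)   # mean rCBF in a region of interest (hypothetical)

r, p = pearsonr(tas20, roi_rcbf)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # a negative r would match the reported right-hemisphere pattern
```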
Two experiments were conducted to investigate the role played by dynamic information in identifying facial expressions of emotion. Dynamic expression sequences were created by generating and displaying morph sequences that changed the face from neutral to a peak expression over different numbers of intermediate frames, yielding fast (6-frame), medium (26-frame), and slow (101-frame) sequences. In experiment 1, participants were asked to describe what the person shown in each sequence was feeling. Sadness was identified more accurately from slow sequences; happiness, and to some extent surprise, were identified better from faster sequences, while anger was detected most accurately from medium-paced sequences. In experiment 2 we used an intensity-rating task with static as well as dynamic images to examine whether these effects were due to the total display time or to the speed of the sequence. Accuracies of expression judgments were derived from the rated intensities, and the results were similar to those of experiment 1 for angry and sad expressions (surprise and happiness were close to ceiling). Moreover, the effect of display time was found only for dynamic expressions and not for static ones, suggesting that it was speed, not time, that was responsible for these effects. These results suggest that representations of basic expressions of emotion encode dynamic as well as static properties.
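For illustration, the sketch below approximates the stimulus construction with a simple linear cross-dissolve between a neutral and a peak-expression image; true morphing would also warp facial landmarks, so this is only an assumed simplification. The frame counts follow those reported for the fast, medium, and slow sequences.

```python
# Minimal sketch, not the authors' stimulus-generation code: neutral-to-peak
# sequences with different numbers of frames via linear cross-dissolve.
import numpy as np

def morph_sequence(neutral, peak, n_frames):
    """Linearly interpolate pixel values from the neutral image to the peak expression."""
    alphas = np.linspace(0.0, 1.0, n_frames)
    return [(1.0 - a) * neutral + a * peak for a in alphas]

# Placeholder images standing in for aligned neutral and peak-expression photographs.
neutral = np.random.rand(128, 128)
peak = np.random.rand(128, 128)

# Fast, medium, and slow sequences, matching the frame counts in experiment 1.
fast, medium, slow = (morph_sequence(neutral, peak, n) for n in (6, 26, 101))
```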