The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects, or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered to 373 healthy participants aged 8-75. In children aged 8-17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults, who showed age-related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression-based approach was adopted to present age- and education- or IQ-adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception.
The purpose of this exploratory study was to examine vergence eye movements during fixations in reading. Eye movements of twelve normal adults were assessed during reading of different materials, that is, words within context (prose passages) and words without context (word lists), as well as during different tasks, that is, reading while attending to the meaning and reading while attending to the sound (words had to be pronounced subvocally). Results indicated that vergence velocity was higher during the reading of prose than during the reading of word lists, as well as higher during reading for meaning than during reading while subvocalizing. These findings held even when only the initial 80 ms of each fixation were measured. Post-hoc analyses indicated that the effects of text type and reading objective were partially, but not entirely, attributable to differences in saccade sizes. The findings are taken to suggest that the increase in vergence velocity results from readers attending to larger units of the text.
Recently we found that adult children whose mothers had had a right-arm preference for holding infants have a reduced left bias for recognising faces, suggesting that they are less well right-hemisphere lateralised for perceiving faces. One possible explanation of this finding is that early visual exposure to faces is suboptimal for right-held infants. To test this idea, we asked mothers to pick up a doll with an inbuilt camera in its face and to start bottle-feeding it. The results showed that less of the mother's face was visible when the doll was held on the right arm than when it was held on the left arm: from the right arm, the left half of the mother's face was less visible when she was looking up, and the right half was less visible when she was looking at the doll. These results suggest that right-held infants receive suboptimal information from faces. Because early face exposure is important for face-processing development, the suboptimal face exposure probably experienced by right-held infants may have consequences for their ability to recognise faces and facial emotion later in life.
Previous studies have shown deficits in the perception of static emotional facial expressions in individuals with autism spectrum disorders (ASD), but results are inconclusive. Possibly, using dynamic facial stimuli expressing emotions at different levels of intensity may produce more robust results, since these resemble the expression of emotions in daily life to a greater extent. Thirty young adolescents with high-functioning ASD (IQ>85) and 30 age- and intelligence-matched controls (ages between 12 and 15) performed the Emotion Recognition Task, in which morphs were presented on a computer screen, depicting facial expressions of the six basic emotions (happiness, disgust, fear, anger, surprise and sadness) at nine levels of emotional intensity (20–100%). The results showed no overall group difference on the ERT, apart from slightly worse performance by the ASD group on the perception of the emotions fear (p<0.03) and disgust (p<0.05). No interaction was found between intensity level of the emotions and group. High-functioning individuals with ASD perform similarly to matched controls on the perception of dynamic facial emotional expressions, even at low intensities of emotional expression. These findings are in agreement with other recent studies showing that emotion perception deficits in high-functioning ASD may be less pronounced than previously thought.