According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet the incidence of such expressions in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecologically valid and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier that used prototypical basic-emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: the subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle, nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication.
Facial expression recognition relies on the processing of diagnostic information from different facial regions. For example, successful recognition of anger requires processing information in the eye/brow region, whereas recognition of disgust requires information in the mouth/nose region. Yet how this information is extracted from the face is less clear. One widespread view, supported by cross-cultural experiments as well as neuropsychological case studies, is that the distribution of gaze fixations on specific diagnostic regions plays a critical role in the extraction of affective information. According to this view, emotion recognition is strongly related to the distribution of fixations across diagnostic regions. Alternatively, facial expression recognition may rely not merely on the exact pattern of fixations but on other factors, such as the processing of extrafoveal information. In the present study, we examined this question by characterizing and using individual differences in fixation distributions during facial expression recognition. We identified 4 groups of observers whose distributions of fixations toward face regions differed in a robust and consistent manner. In line with previous studies, we found that different facial emotion categories evoked distinct distributions of fixations according to their diagnostic facial regions. However, individuals' distinctive patterns of fixations were not correlated with emotion recognition: individuals who focused strongly on the eyes and individuals who focused strongly on the mouth achieved comparable emotion recognition accuracy. These findings suggest that extrafoveal processing may play a larger role in emotion recognition from faces than previously assumed. Consequently, successful emotion recognition can arise from diverse patterns of fixations.
Stability and change in early autism spectrum disorder risk were examined in a cohort of 99 preterm infants (≤34 weeks of gestation) using the Autism Observation Scale for Infants at 8 and 12 months and the Autism Diagnostic Observation Schedule-Toddler Module at 18 months. A total of 21 infants were identified as at risk by the Autism Observation Scale for Infants at 8 months, and 9 were identified as at risk at 12 months, including 4 children who had not previously been identified. At 18 months, 8 children were identified as at risk for autism spectrum disorder using the Autism Diagnostic Observation Schedule-Toddler Module, only half of whom had been identified using the original Autism Observation Scale for Infants cutoffs. Results are discussed in relation to early trajectories of autism spectrum disorder risk among preterm infants, as well as to the identification of social-communication deficiencies associated with the early preterm behavioral phenotype.