Pictures of emotionally neutral, positive, and negative (threat- or harm-related) scenes were presented for 3 seconds, paired with nonemotional control pictures. The eye fixations of high- and low-trait-anxiety participants were monitored. Intensity of stimulus emotionality was varied, with two levels of perceptual salience for each picture (colour vs. greyscale). Regardless of perceptual salience, high anxiety was associated with (a) preferential attention towards all types of emotional stimuli in initial orienting, as revealed by a higher probability of first fixation on the emotional picture than on the neutral picture of a pair; (b) preferential attention towards positive and harm stimuli in a subsequent stage of early engagement, as shown by longer viewing times during the first 500 ms following picture onset; and (c) attention away from (i.e., avoidance of) harm stimuli in a later phase, as indicated by shorter viewing times and a lower frequency of fixation during the last 1000 ms of picture exposure. This suggests that the nature of the attentional bias varies over the time course of processing emotional pictures.
We investigated the minimum expressive intensity required to recognize (above chance) static and dynamic facial expressions of happiness, sadness, anger, disgust, fear, and surprise. To this end, we varied the degree of intensity of emotional expressions unfolding from a neutral face by means of graphics morphing software. The resulting face stimuli (photographs and short videos) were presented in an expression categorization task for 1 s each, and measures of sensitivity or discrimination (A') were collected to establish thresholds. A number of physical, perceptual, categorical, and affective controls were performed. All six basic emotions were reliably recognized above chance level from low intensities, although recognition thresholds varied across expressions: 20% intensity for happiness; 40% for sadness, surprise, anger, and disgust; and 50% for fear. The advantage of happy faces may be due to their greater physical change in facial features relative to neutral faces (as shown by automated facial expression measurement), even at low levels of intensity. Recognition thresholds and the pattern of confusions across expressions were nevertheless equivalent for dynamic and static expressions, although dynamic expressions were recognized more accurately and faster.
We investigated the visual attention patterns (i.e., where, when, how frequently, and how long viewers look at each face region) for faces with (a) genuine, enjoyment smiles (i.e., a smiling mouth and happy eyes with the Duchenne marker), (b) fake, nonenjoyment smiles (a smiling mouth but nonhappy eyes: neutral, surprised, fearful, sad, disgusted, or angry), or (c) no smile (and nonhappy eyes). Viewers evaluated whether the faces conveyed happiness ("felt happy") or not, while their eye movements were monitored. Results indicated, first, that the smiling mouth was more likely than the eyes to capture the first fixation, and captured it faster, regardless of the type of eyes. This reveals similar attentional orienting to genuine and fake smiles. Second, the mouth and, especially, the eyes of faces with fake smiles received more fixations and longer dwell times than those of faces with genuine smiles. This reveals attentional engagement, with a processing cost for fake smiles. Finally, when the mouth of a face with a fake smile was fixated earlier than the eyes, the face was likely to be judged as genuinely happy. This suggests that a first fixation on the smiling mouth biases viewers to misinterpret the emotional state underlying blended expressions.