Face recognition depends critically on horizontal orientations (Goffaux & Dakin, 2010). Face images that lack horizontal features are harder to recognize than those in which that information is preserved. Here, we asked whether facial emotion recognition also exhibits this dependency by asking observers to categorize orientation-filtered happy and sad expressions. Furthermore, we aimed to dissociate image-based orientation energy from object-based orientation by rotating images 90 degrees in the picture plane. In our first experiment, we showed that the perception of emotional expression does depend on horizontal orientations and that object-based orientation constrained performance more than image-based orientation. In Experiment 2 we showed that mouth openness (i.e., open versus closed mouths) also influenced the emotion-dependent reliance on horizontal information. Lastly, we describe a simple computational analysis demonstrating that the impact of mouth openness was not predicted by variation in the distribution of orientation energy across horizontal and vertical orientation bands. Overall, our results suggest that emotion recognition largely depends on horizontal information defined relative to the face, but that this bias is modulated by multiple factors that introduce variation in appearance across and within distinct emotions.
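The abstract's computational analysis measures how image energy is distributed across horizontal and vertical orientation bands. A minimal sketch of one common way to do this is shown below, using the 2D Fourier spectrum; the band half-width (here ±20°), the synthetic test image, and the function name are illustrative assumptions, not the paper's actual method or parameters.

```python
# Hypothetical sketch of an orientation-energy analysis: sum Fourier
# power within +/-20 deg orientation bands. The band width and all
# names here are assumptions for illustration, not the paper's method.
import numpy as np

def orientation_energy(img, half_width_deg=20.0):
    """Return (horizontal, vertical) orientation-band energy of img.

    Horizontal image structure (e.g. eyes, mouth) concentrates its
    spectral power near the vertical frequency axis, so the
    'horizontal' band sums components within half_width_deg of that
    axis, and the 'vertical' band does the converse.
    """
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    h, w = img.shape
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(h)),
                         np.fft.fftshift(np.fft.fftfreq(w)),
                         indexing="ij")
    # Spectral orientation in degrees; 0 = horizontal frequency axis.
    theta = np.degrees(np.arctan2(fy, fx)) % 180.0
    horiz_band = np.abs(theta - 90.0) <= half_width_deg
    vert_band = np.minimum(theta, 180.0 - theta) <= half_width_deg
    power[h // 2, w // 2] = 0.0  # exclude the DC component
    return power[horiz_band].sum(), power[vert_band].sum()

# Sanity check on a synthetic horizontal grating (horizontal stripes):
# its energy should fall almost entirely in the horizontal band.
y = np.arange(64)[:, None] * np.ones((1, 64))
grating = np.sin(2 * np.pi * y / 8.0)
h_e, v_e = orientation_energy(grating)
```

In this framing, comparing `h_e` and `v_e` across face images (e.g., open- versus closed-mouth expressions) would index whether an appearance difference shifts the horizontal/vertical energy balance.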