2005
DOI: 10.1111/j.0956-7976.2005.00801.x

Transmitting and Decoding Facial Expressions

Abstract: This article examines the human face as a transmitter of expression signals and the brain as a decoder of these expression signals. If the face has evolved to optimize transmission of such signals, the basic facial expressions should have minimal overlap in their information. If the brain has evolved to optimize categorization of expressions, it should be efficient with the information available from the transmitter for the task. In this article, we characterize the information underlying the recognition of th…

Cited by 646 publications (802 citation statements)
References 20 publications

Citation statements (ordered by relevance)

“…This information was used even though the mouth was not particularly helpful for discriminating between these two particular identities (confirmed by the saliency model). Such a bias to draw information from the mouth is very much in line with other published work on information use during face judgments, which is suggested to be broadly optimized to support face expertise (Smith, Cottrell, Gosselin, & Schyns, 2005). We speculate that across their extensive face experience accumulated from infancy, typical individuals might develop a “default” face processing strategy, which is then flexibly adapted to match stimulus characteristics and task demands but was not wholly recalibrated in the 216 trials of the current study.…”
Section: Discussion (supporting)
confidence: 82%
“…scan paths of the eyes and mouth may be unrelated to autism diagnosis, but instead are determined (in part) by the degree of co-morbid alexithymia in the sample of individuals with autism. Second, many studies have demonstrated the importance of typical scan paths in recognising facial emotion (Aviezer et al., 2008; Calder et al., 2000; Smith et al., 2005; Wong et al., 2005); therefore, if alexithymia is associated with atypical scan paths to the eyes and mouth, then the inconsistent findings with respect to recognition of emotional facial expressions in autism (see Bal et al., 2010; Jemel et al., 2006) may also be explained by varying degrees of co-morbid alexithymia across the autism samples used in different studies. We therefore suggest that future studies of emotion processing in individuals with autism obtain measures of alexithymia in order to determine whether any impairments seen are due to autism, alexithymia, or the combination of these two factors.…”
Section: Discussion (mentioning)
confidence: 99%
“…More generally, many studies show that in order to accurately categorize observed emotional expressions one must appropriately scan eye and mouth features, as different emotional expressions are most reliably signalled by different parts of the face (Aviezer et al., 2008; Calder et al., 2000; Smith et al., 2005; Wong et al., 2005).…”
mentioning
confidence: 99%