2010
DOI: 10.1523/jneurosci.4863-09.2010
Internal and External Features of the Face Are Represented Holistically in Face-Selective Regions of Visual Cortex

Abstract: The perception and recognition of familiar faces depends critically on an analysis of the internal features of the face (eyes, nose, mouth). We therefore contrasted how information about the internal and external (hair, chin, face outline) features of familiar and unfamiliar faces is represented in face-selective regions. There was a significant response to both the internal and external features of the face when presented in isolation. However, the response to the internal features was greater than the respon…

Cited by 135 publications (118 citation statements). References 35 publications.
“…Similarly, there are effects involving the face-sensitive N170 ERP (Jacques & Rossion, 2009; Kuefner et al., 2010), and the steady-state visual evoked potential has obvious promise (Norcia et al., 2015; Alp et al., 2016; Rossion, 2017). Whilst these neurophysiological findings are at present based mainly on studies involving unfamiliar faces (but see Andrews et al., 2010), they converge with our demonstration from familiar faces of a strong underlying perceptual basis for the composite face effect.…”

Section: Discussion (supporting)
confidence: 85%
“…For example, fMRI has shown release from adaptation based on the composite face illusion in face-selective visual areas in the ventral stream (Schiltz & Rossion, 2006; Andrews et al., 2010; Schiltz et al., 2010). Since these effects arise even with completely incidental tasks (such as detecting a red dot superimposed on some stimuli) in regions considered to be primarily ‘visual’ in nature, it seems unlikely that decision processes play a significant role.…”

Section: Discussion (mentioning)
confidence: 99%
“…Sex categorization times on correct trials did not differ across the three types of dyads [F(2,22) = .24, p = .79]. … (Rossion et al., 2012; Weiner & Grill-Spector, 2010) and replicate reports according to which face-selective activity is right-lateralized and elicited more consistently in the FFA than the OFA (e.g., Andrews, Davies-Thompson, Kingstone, & Young, 2010; Engell & McCarthy, 2013). Mean parameter estimates in all five ROIs were extracted from the main experiment for each participant (see Figure 3).…”

Section: Behavioral Analysis (supporting)
confidence: 65%
“…Specifically, the OFA and FFA, which are primarily involved in recognition of individual identity (Grill-Spector et al., 2004), are sensitive to different aspects of faces (Liu et al., 2010). The OFA is sensitive to the presence of face parts (Pitcher et al., 2007; Harris and Aguirre, 2008; Andrews et al., 2010), whereas the FFA is preferentially involved in analyzing the configuration among them (Barton et al., 2002; Yovel and Kanwisher, 2004; Schiltz and Rossion, 2006; Rotshtein et al., 2007; Schiltz et al., 2010). Further, typical face processing requires the interaction of the OFA and FFA: the FFA, for example, can be preserved in an individual who has a lesion to the OFA and suffers from prosopagnosia (i.e., severe deficits in face recognition) (Rossion et al., 2003; Steeves et al., 2006).…”

Section: Discussion (mentioning)
confidence: 99%