2019
DOI: 10.1016/j.brainres.2019.146343

From eye to face: The impact of face outline, feature number, and feature saliency on the early neural response to faces

Cited by 8 publications (7 citation statements)
References 62 publications
“…Baron-Cohen (1994, 2005) hypothesized an eye-direction detector, an intentionality detector, an emotion detector, a shared-attention mechanism, an empathizing system, and a theory-of-mind mechanism. The eye-direction detector has subsequently found support from the ERP research reviewed earlier, which suggests a specific eye-region detector (e.g., Engell & McCarthy, 2014; Parkington & Itier, 2019). Based partly on findings of complex response profiles to social-attention signals in monkey superior temporal sulcus (STS), Perrett and Emery (1994) theorized a direction-of-attention detector and a mutual-attention mechanism in addition to those in Baron-Cohen's model.…”
Section: Current Theories and Models of Shared Attention
confidence: 96%
“…Indeed, it has long been known that viewing eyes alone elicits an N170 greater in amplitude than that elicited by a whole face (Bentin, Allison, Puce, Perez, & McCarthy, 1996). More recently, gaze-contingent studies have demonstrated that the sensitivity of the N170 to gaze stimuli can be detected when participants fixate on the eyes of a face, not just when eye regions are presented in isolation (Itier & Preston, 2018; Nemrodov, Anderson, Preston, & Itier, 2014; Parkington & Itier, 2018, 2019). In children, the N170 has been shown to be larger for eyes than for faces and to reach an adult-like event-related potential (ERP) profile by the age of 11, suggesting the N170 may be driven largely by eyes and mature earlier than the N170 for complete faces, whose development continues into adulthood (Taylor, Edmonds, McCarthy, & Allison, 2001).…”
Section: Early Event-Related Potentials: The N170, EDAN, and N2pc
confidence: 99%
“…The evidence supporting the above theoretical speculation comes from research on the early processing of faces. For example, visual search tasks have found that the eyes have very high priority in visual search 16,47 ; eye-movement studies have found that when people look at faces, the probability of the first fixation landing on the eye area (the eyes themselves and the surrounding region) is extremely high 15,48 , and 1–2 fixations allow participants to recognize faces quite accurately 15 ; ERP studies have found that the amplitude of the N170 (considered to be a component specific to holistic face processing) induced by eyes is much higher than that induced by other facial features 19,20,49–51 . These results suggest that the human cognitive nervous system may have an "eye detector" that allows people to detect the eyes accurately at a very early stage of perception, completing the first stage of face perception.…”
Section: Discussion
confidence: 99%
“…For instance, Hsiao and colleagues used the eye-tracking technique and observed that the number of fixations on the eyes was significantly higher than that on the mouth, showing a bias toward fixating on the upper half of the face 15 ; Burton and colleagues used the visual search paradigm and found that participants were faster and more accurate in searching for an upper facial half than a lower half in a complex visual background 16 ; Wang and colleagues, using a face-matching task, found that participants were more sensitive to changes in the eye region than to changes in the mouth region 10 . In addition, some studies have found that N170 amplitudes were stronger when participants fixated on the eyes relative to other regions 19 , 20 . Considering that the N170 is regarded as an ERP index of holistic face processing 21 , 22 , this may suggest that processing of the upper half of the face (versus the lower half) is more involved in holistic face processing.…”
Section: Introduction
confidence: 99%
“…Besides these studies, a series of studies has investigated the effects of different face features on ERP responses (Neath & Itier, 2015; Neath-Tavares & Itier, 2016; Nemrodov et al., 2014; Parkington & Itier, 2018, 2019; see also Aguado et al., 2019; de Lissa et al., 2014; Towler & Eimer, 2015). In particular, Neath and Itier (2015) and Neath-Tavares and Itier (2016) demonstrated emotional modulation of the P1 and N170 components across changes in spatial location.…”
confidence: 99%