2015
DOI: 10.1016/j.biopsycho.2014.11.012
ERP signs of categorical and supra-categorical processing of visual information

Abstract: Overall, the present ERP results revealed shared and distinct mechanisms of access to supra-categorical and categorical knowledge, just as shared and distinct neural representations underlie the processing of diverse semantic categories. Additionally, they outlined the serial nature of categorical and supra-categorical representations, indicating the sequential steps of access to these separate knowledge types.

Cited by 18 publications (15 citation statements)
References 92 publications
“…Specifically, monolinguals exhibited less negativity in the N400 for the semantically-related condition than in the unrelated condition while bilinguals did not. Attenuation of the N400 in response to a semantic relationship is consistent with results of monolingual studies using pictures (e.g., Chauncey, Holcomb, & Grainger, 2009; McPherson & Holcomb, 1999; Zani et al, 2015), but few studies have shown N400 attenuation coupled with longer RTs. Blackford et al (2012) suggested that the N400 indexes how the semantic relationship is perceived and integrated (i.e., automatic electrophysiological semantic priming) but does not directly reflect the factors involved in later response selection.…”
Section: Study (supporting)
confidence: 79%
“…This component is sensitive to semantic and lexical mismatches between the stimulus and expectations such that mismatches are associated with larger negative amplitudes than matches (Kutas & Federmeier, 2011). In paradigms in which two semantically related pictures are presented either sequentially (Holcomb & McPherson, 1994; McPherson & Holcomb, 1999) or simultaneously (Zani et al, 2015), relatedness has resulted in less N400 negativity than found on unrelated pairs. This attenuation of the N400 for related primes has been interpreted as semantic integration (Holcomb & McPherson, 1994; Kutas & Federmeier, 2011).…”
Section: Introduction (mentioning)
confidence: 99%
“…Furthermore, in this range the event-related activity is influenced by factors other than the physical characteristics of the visual events and spatial attention effects. These factors involve the intention to discriminate (e.g., Hopf, Vogel, Woodman, Heinze, & Luck, 2002) and other task-related effects (Zani et al, 2015). Such studies have demonstrated the effects of different forms of attention-in other words, top-down influences.…”
Section: Discussion (mentioning)
confidence: 99%
“…How people perceive, judge, and interact with others is strongly influenced by what they know about them. Even abstract and verbally transmitted information concerning good or bad social behavior can affect how they judge others (Bliss-Moreau, Barrett, & Wright, 2008; Goodwin, Piazza, & Rozin, 2014), how they perceive others' faces or facial expressions (Abdel Rahman, 2011; Luo, Wang, Dzhelyova, Huang, & Mo, 2016; Suess, Rabovsky, & Abdel Rahman, 2015; Wieser et al, 2014; Xu, Li, Diao, Fan, & Yang, 2016), and may even affect whether they see others' faces in the first place (Anderson, Siegel, Bliss-Moreau, & Barrett, 2011; but see Rabovsky, Stein, & Abdel Rahman, 2016; Stein, Grubb, Bertrand, Suh, & Verosky, 2017). Here we consider one factor that may influence the potency of social-emotional information to modulate person evaluations: the verbally marked trustworthiness of the information.…”
Section: Praised Be Doubt! I Advise You To Greet (mentioning)
confidence: 99%
“…However, we also analyzed the time window from 300 to 350 ms since the EPN has been found slightly later for newly learned faces (up to 350 ms, cf. Luo et al, 2016; Suess et al, 2015; Xu et al, 2016; see Figure 1 in Supplemental Material).…”
Section: Person Judgment (mentioning)
confidence: 99%