2017
DOI: 10.3389/fnhum.2017.00018

Facial Expression Related vMMN: Disentangling Emotional from Neutral Change Detection

Abstract: Detection of changes in facial emotional expressions is crucial to communicate and to rapidly and automatically process possible threats in the environment. Recent studies suggest that expression-related visual mismatch negativity (vMMN) reflects automatic processing of emotional changes. In the present study we used a controlled paradigm to investigate the specificity of emotional change-detection. In order to disentangle specific responses to emotional deviants from that of neutral deviants, we presented neu…

Cited by 44 publications (84 citation statements)
References 64 publications
“…The mismatch response observed in our early window peaked after 200 ms. Earlier vMMN signals have been reported in the 100-200 ms range (Kovarski et al., 2017; Pazo-Alvarez et al., 2003), although these are not universal. In the current context, several causes could account for the relatively late emergence of the first signal of pattern deviance.…”
Section: A Late 'Early' Mismatch Response
confidence: 94%
“…Visual mismatch signals have also been studied in the context of face perception. An increasing number of studies now describe a mismatch signal for face emotional expressions (Susac, Ilmoniemi, Pihko, & Supek, 2004; Zhao & Li, 2006; Stefanics, Csukly, Komlósi, Czobor, & Czigler, 2012; Astikainen, Cong, Ristaniemi, & Hietanen, 2013; Vogel, Shen, & Neuhaus, 2015; Kovarski et al., 2017). Similarly, face gender regularities have also been shown to be tracked by the visual system (Wang et al., 2016).…”
Section: Mismatch Responses In the Visual Hierarchy
confidence: 99%
“…Minimum a priori sample size was set at fifteen participants, based on sample sizes typical for many recent vMMN studies using similar designs (Durant, Sulykos, & Czigler, 2017; Kovarski et al., 2017). ERP data were recorded from 20 neurologically typical students (16 female, mean age 22.7 years) who participated as volunteers (5 additional participants were recorded in case of potential technical problems).…”
Section: Participants
confidence: 99%
“…There are known effects of social relevance on mismatch responses in the visual and auditory modalities, notably when manipulating the communicative nature of the signals: in sequences of emotional face stimuli, Campanella and colleagues (2002) found earlier and larger mismatch responses to changes of expressions that led to a different emotional appraisal (e.g. a happy face in a sequence of sad faces) than to a different depiction of the same emotion (see also Bayer et al., 2017; Kovarski et al., 2017). In the auditory domain, affiliative signals such as laughter evoke larger MMN than a non-affiliative growl (e.g.…”
Section: Introduction
confidence: 99%