2011
DOI: 10.1111/j.2044-8295.2011.02056.x

Neural mechanisms of the automatic processing of emotional information from faces and voices

Abstract: Theoretical accounts suggest an increased and automatic neural processing of emotional, especially threat-related, facial expressions and emotional prosody. In line with this assumption, several functional imaging studies showed activation to threat-related faces and voices in subcortical and cortical brain areas during attentional distraction or unconscious stimulus processing. Furthermore, electrophysiological studies provided evidence for automatic early brain responses to emotional facial expressions and e…

Cited by 31 publications (38 citation statements)
References 157 publications (268 reference statements)
“…The plot shows the difference of parameter estimates (mean and standard error) of the mean activation of the amygdala across experimental conditions. […] investigate the topography and time sequence of automatic brain responses to emotional expressions of faces and voices under attentional load, since several electrophysiological results suggest automatic responses to facial expressions even under high cognitive load (Straube et al, 2011b). Thus, findings from functional imaging studies and EEG studies might result in different observations.…”
Section: Discussion (mentioning)
confidence: 98%
“…Thus, findings from functional imaging studies and EEG studies might result in different observations. However, thus far, designs are only partially comparable, and only research using the same designs, and preferably a within-subject paradigm, can solve this issue (Straube et al, 2011b).…”
Section: Discussion (mentioning)
confidence: 99%
“…Some studies reported valence-specific effects, particularly threat-specific effects (e.g., Whalen et al, 1998; LeDoux, 2003; Gamer and Büchel, 2009; Inagaki et al, 2012; Furl et al, 2013; Sauer et al, 2014); whereas other studies showed modulations by emotion in general (Yang et al, 2002; Santos et al, 2011) or no emotional effects at all on amygdalar activation (e.g., Fitzgerald et al, 2006; Sato et al, 2010). These discrepancies may be related to multiple factors, such as attention (e.g., Pessoa et al, 2002; Straube et al, 2011a), face habituation (e.g., Breiter et al, 1996; Wright et al, 2001), ambiguity of facial expression (Adams et al, 2003), task condition (e.g., implicit or explicit; e.g., Critchley et al, 2000; Habel et al, 2007) and arousal differences between positive and negative expressions (e.g., Sauer et al, 2014). With regard to the last point, studies that vary arousal of facial expressions in a controlled way (i.e., using positive and negative expressions of matched arousal values) should be highly informative, similar to studies with other kinds of emotional stimuli mentioned above.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, happy as compared to angry or fearful expressions are more common in everyday life and are therefore often perceived as less arousing and intense. In this case, the relevance or significance of faces may be reduced, altering the effects on amygdalar activation (Somerville and Whalen, 2006; Straube et al, 2011a). Therefore, it is still unclear whether these studies (N’Diaye et al, 2009; Mattavelli et al, 2014) used sufficiently strong expressions of happiness, with comparable arousal ratings between happy and threat-related expressions.…”
Section: Introduction (mentioning)
confidence: 99%
“…This paradigm is well-suited for the current study's objectives, as the emotional stimuli (faces and vignettes) embody the automatic versus effortful dichotomy described above. Faces capture attention more readily than other objects in one's environment (Ro, Russell, & Lavie, 2001), and information from emotional facial expressions is extracted even more rapidly and automatically (for detailed reviews, see Straube, Mothes-Lasch, & Miltner, 2011; Vuilleumier, 2002). Indeed, Vuilleumier (2002) concludes there is "remarkable convergence of behavioural and neurophysiological evidence suggesting that our brain is equipped with mechanisms for enhancing the detection of and reaction to emotional facial information" (p. 297).…”
mentioning
confidence: 93%