2008
DOI: 10.1017/s1355617708080740
Perception of affective prosody in major depression: A link to executive functions?

Abstract: Major depression is associated with impairments of executive functions and affect perception deficits, both being linked to dysfunction of fronto-subcortical networks. So far, little is known about the relationship between cognitive and affective deficits in major depression. In the present investigation, affect perception and executive functions were assessed in 29 patients with a diagnosis of major depression (Dep) and 29 healthy controls (HC). Both groups were comparable on IQ, age, and gender distribution. …

Cited by 56 publications (48 citation statements)
References 44 publications
“…In a pattern similar to the results reported for identification of facial emotion, a bias toward interpreting neutral prosodic emotions (i.e., surprise) as negative, 152,205 and difficulty identifying both positively valenced 206,207 and negatively valenced 206,207 emotional tones (for contradictory findings, see Uekermann and colleagues 208 ) have been reported in homogeneous samples of acutely depressed patients. Similar findings of impaired recognition of positively 206 and of negatively valenced 206,209 tones have been reported in both actively ill and euthymic 209 patients with bipolar disorder (for contradictory findings, see Harmer 175 ).…”
supporting
confidence: 79%
“…Therefore, the pattern of present results suggests that as the tasks become more difficult, the DEP group performs progressively worse than the ND group. This explanation dovetails with a recent study suggesting that prosody comprehension deficits in individuals with major depression may be influenced by executive functioning [41].…”
Section: Discussion
supporting
confidence: 83%
“…Punkanen et al (2011) reported that depressed subjects mostly confuse musical expressions of fear and sadness with anger, showing also that they were less accurate than the control group in decoding happiness and tenderness. Uekermann et al (2008) exploiting prosodic emotional stimuli found that depressed subjects show impairments for anger, happiness, fear, and neutral expressions (but not for sadness). Kan et al (2004) exploiting mute video and audio recordings found a poorer decoding accuracy of surprise for the audio stimuli.…”
Section: Results
mentioning
confidence: 99%
“…This states that it is a universal and innate human ability to recognize the facial expressions corresponding to the six emotions called basic or primary (happiness, surprise, disgust, sadness, fear, and anger). The effectiveness of people with depression in decoding emotional expressions through photos was investigated with several methodologies, including the morphing task (Bediou et al., 2005; Joormann and Gotlib, 2006; Gilboa-Schechtman et al., 2008; LeMoult et al., 2009; Schaefer et al., 2010; Aldinger et al., 2013), the emotion recognition task (Kan et al., 2004; Leppänen et al., 2004; Gollan et al., 2008, 2010; Uekermann et al., 2008; Wright et al., 2009; Douglas and Porter, 2010; Milders et al., 2010; Naranjo et al., 2011; Punkanen et al., 2011; Péron et al., 2011; Watters and Williams, 2011; Schneider et al., 2012; Schlipf et al., 2013; Chen et al., 2014), the emotion attentional task (Gotlib et al., 2004; Joormann and Gotlib, 2007; Leyman et al., 2007; Kellough et al., 2008; Sanchez et al., 2013; Duque and Vázquez, 2014), the matching task (Milders et al., 2010; Liu et al., 2012; Chen et al., 2014), and the dot-probe detection task (Fritzsche et al., 2010).…”
mentioning
confidence: 99%