2018
DOI: 10.1038/s41598-018-32868-3
Categorical emotion recognition from voice improves during childhood and adolescence

Abstract: Converging evidence demonstrates that emotion processing from facial expressions continues to improve throughout childhood and part of adolescence. Here we investigated whether this is also the case for emotions conveyed by non-linguistic vocal expressions, another key aspect of social interactions. We tested 225 children and adolescents (age 5–17) and 30 adults in a forced-choice labeling task using vocal bursts expressing four basic emotions (anger, fear, happiness and sadness). Mixed-model logistic regressi…

Cited by 43 publications (46 citation statements)
References 84 publications
“…around the age of 14 to 15 years (Grosbras, Ross, & Belin, 2018), the current study and previous work have noted continued improvement beyond age 15 for the recognition of vocal affect in speech (Chronaki et al., 2018; Morningstar et al., 2018a). Thus, it is possible that age-related changes in the neural representation of these stimuli also may differ from those noted in response to speech-based vocal emotion.…”
Section: Fig (supporting)
confidence: 69%
“…Emotion recognition also has a complex developmental trajectory, with studies suggesting that recognition of facial emotion is not adult-like until approximately 11 years of age (Tonks et al., 2007; Gao and Maurer, 2010; Chronaki et al., 2015), bodily emotions at approximately 8 years of age (Boone and Cunningham, 1998; Lagerlof and Djerf, 2009; Ross et al., 2012), and vocal emotion recognition ability still developing into adolescence (Chronaki et al., 2015; Grosbras et al., 2018). Might these differing trajectories be a product of differential development in simulation abilities?…”
Section: The Origin Of Emotional Understanding (mentioning)
confidence: 99%
“…However, Hatfield et al. (1993) argue that contagion is a higher-level cognitive phenomenon that can involve not only the synchronization of facial expressions but, as mentioned above, also vocalizations, postures and the movements of another person. We already know that the body (Atkinson et al., 2004; de Gelder et al., 2010; de Gelder and Van den Stock, 2011) and the voice (Belin et al., 2008; Grosbras et al., 2018) are important expressive channels for providing cues to another's emotional state, but to date these cues have been largely omitted in sensorimotor simulation models.…”
Section: Introduction (mentioning)
confidence: 99%
“…It provides us with information about other people's internal emotional states and helps us to interpret and predict their behaviour. Children have typically acquired the vocabulary for basic emotions by 4–6 years of age (Baron-Cohen, Golan, Wheelwright, Granader, & Hill, 2010; Ridgeway, Waters, & Kuczaj, 1985), but accuracy in identifying non-verbal emotional cues continues to improve into late adolescence (Grosbras, Ross, & Belin, 2018; Herba & Phillips, 2004; Rodger, Vizioli, Ouyang, & Caldara, 2015). Accurate emotion identification has been linked to positive outcomes later in development, including academic success (Denham et al., 2012; Izard et al., 2001), social integration (Sette, Spinrad, & Baumgartner, 2017) and good mental health (Ciarrochi, Scott, Deane, & Heaven, 2003).…”
Section: Introduction (mentioning)
confidence: 99%