2006
DOI: 10.1016/s0079-6123(06)56012-1
Intonation as an interface between language and affect

Cited by 97 publications (79 citation statements)
References 23 publications
“…More generally, music and speech are both auditory signals that acquire meaning through changes in attributes such as pitch, timing, intensity, and timbre. Prosody refers to all suprasegmental changes that occur in the course of an utterance (40,41). Linguists generally differentiate between two prosodic phenomena, namely linguistic prosody, which belongs to the language itself, and emotional prosody, which depends on the emotional state of the speaker.…”
mentioning, confidence: 99%
“…We also include discussions of data regarding emotional prosody recognition in sections where these enable a more complete understanding of the role of a particular region in emotion recognition processes. (For recent, more comprehensive reviews of emotion recognition from vocal cues, see Bachorowski & Owren, 2008 and Grandjean, Banziger, & Scherer, 2006.) We will focus particularly on three anatomical structures or regions: the amygdala and right somatosensory cortex, primarily because their roles in emotion recognition based on other nonverbal cues have been explored, and the face- and body-selective areas of occipitotemporal cortex.…”
Section: Simulation or Shared-substrates Models of Emotion Recognition, mentioning
confidence: 99%
“…For example, birds and mammals use harsher (covering a wider frequency band, as opposed to pure-tone-like), lower-frequency vocalisations in hostile contexts compared to friendly settings (Morton, 1977). It has been proposed that these primitive aspects of affective expression have been retained even after the evolutionary transition from basic animal vocalisation to the more sophisticated human language (Grandjean et al., 2006). For example, the physiological changes induced by emotional arousal (e.g.…”
Section: Introduction, mentioning
confidence: 99%