Emotional information can be conveyed by various means of communication, such as propositional content, speech intonation, facial expression, and gestures. Prior studies have demonstrated that inputs from one modality can alter perception in another modality. To evaluate the impact of emotional intonation on ratings of emotional faces, a behavioral study was first carried out. Second, functional magnetic resonance imaging (fMRI) was used to identify brain regions that mediate crossmodal effects of emotional prosody on judgments of facial expressions. In the behavioral study, subjects rated fearful and neutral facial expressions as being more fearful when accompanied by a fearful voice than when the same facial expressions were presented without a concomitant auditory stimulus, whereas no such influence on the rating of faces was found for happy voices. In the fMRI experiment, this shift in the rating of facial expressions in the presence of a fearfully spoken sentence was correlated with the hemodynamic response in the left amygdala extending into the periamygdaloid cortex, which suggests that crossmodal effects on cognitive judgments of emotional information are mediated via these neuronal structures. Furthermore, significantly stronger activations were found in the mid-portion of the right fusiform gyrus during judgment of facial expressions in the presence of fearful as compared to happy intonations, indicating that enhanced processing of faces within this region can be induced by the presence of threat-related information perceived via the auditory modality. Presumably, these increased extrastriate activations correspond to enhanced alertness, whereas responses within the left amygdala modulate cognitive evaluation of emotional facial expressions.
Functional magnetic resonance imaging was used to investigate hemodynamic responses to adjectives pronounced in happy and angry intonations of varying emotional intensity. In separate sessions, participants judged the emotional valence of either intonation or semantics. To disentangle effects of emotional prosodic intensity from confounding acoustic parameters, the mean and variability of the volume and fundamental frequency of each stimulus were included as nuisance variables in the statistical models. A linear dependence between hemodynamic responses and the emotional intensity of happy and angry intonations was found in the bilateral superior temporal sulcus during both tasks, indicating that increases in hemodynamic responses in this region are elicited by both positive and negative prosody, independent of low-level acoustic properties and task instructions.
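The confound-control approach described above (estimating the effect of emotional intensity while partialling out low-level acoustic parameters as nuisance regressors) can be sketched as an ordinary least-squares model. This is an illustrative toy example, not the authors' actual analysis pipeline; all variable names and the simulated data are hypothetical.

```python
# Sketch: regress a simulated per-trial BOLD response on rated emotional
# intensity while including acoustic confounds (mean/variability of volume
# and fundamental frequency, F0) as nuisance regressors in an OLS model.
# Purely illustrative; the data are simulated, not from the study.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 60

# Hypothetical per-trial stimulus measures.
intensity = rng.uniform(0, 1, n_trials)      # rated emotional intensity
volume_mean = rng.normal(60, 5, n_trials)    # mean volume (dB), confound
volume_sd = rng.normal(4, 1, n_trials)       # volume variability, confound
f0_mean = rng.normal(200, 30, n_trials)      # mean F0 (Hz), confound
f0_sd = rng.normal(25, 5, n_trials)          # F0 variability, confound

# Simulated response: driven by intensity, contaminated by mean volume.
bold = 2.0 * intensity + 0.1 * volume_mean + rng.normal(0, 0.2, n_trials)

# Design matrix: intercept, regressor of interest, nuisance regressors.
X = np.column_stack([
    np.ones(n_trials), intensity,
    volume_mean, volume_sd, f0_mean, f0_sd,
])

# OLS fit; beta[1] is the intensity effect adjusted for the confounds.
beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
print(f"confound-adjusted intensity effect: {beta[1]:.2f}")
```

Because the acoustic measures enter the same design matrix, any variance they share with the response is absorbed by their own coefficients, so `beta[1]` reflects emotional intensity over and above the low-level acoustics.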
Our neuropsychological findings suggest that episodic autobiographical memory is impaired in patients with long-standing secondary progressive multiple sclerosis (SPMS), possibly due to neurodegenerative processes in functionally relevant brain regions.