2003
DOI: 10.1016/s0896-6273(03)00199-5

Hemispheric Dissociation in Access to the Human Semantic System

Abstract: Patient studies suggest that speech and environmental sounds are differentially processed by the left and right hemispheres. Here, using functional imaging in normal subjects, we compared semantic processing of spoken words to equivalent processing of environmental sounds, after controlling for low-level perceptual differences. Words enhanced activation in left anterior and posterior superior temporal regions, while environmental sounds enhanced activation in a right posterior superior temporal region. This le…

Cited by 118 publications (115 citation statements). References 48 publications.
“…There is indeed evidence from dichotic listening studies that non-linguistic stimuli are processed more efficiently in the right hemisphere (e.g., [2,19]). Moreover, recent studies suggested right hemisphere predominance in the acoustic and semantic analysis of environmental sounds (e.g., [15,26]). The greater activation of the right hemisphere would thus have favored the occurrence of attentional biases linked to left-presented emotional sounds.…”
Section: Laterality Effects (mentioning)
confidence: 99%
“…First, contrary to the arbitrary relationship that the sound pattern of words has to real-world objects or events, for many environmental sounds the mapping with meaning results from the physical properties of the object or event in question [28], which may lead to stronger attentional biases towards the emotional content of sounds. Second, several studies reported partially dissociated brain regions in the higher-order processing of verbal and nonverbal stimuli, including a greater involvement of the left and right hemispheres, respectively (e.g., [15,26]). Third, the spatial distribution of attention for verbal and nonverbal sounds may not follow the same rules.…”
Section: Introduction (mentioning)
confidence: 99%
“…Previous studies, however, have mostly focused on actions that we all learn from infancy and that are typically overexperienced, for example, hand clapping (Pizzamiglio et al., 2005), tongue clicking (Hauk et al., 2006), and speech (Fadiga et al., 2002; Wilson et al., 2004; Buccino et al., 2005). One other concern, at least with regard to the sound of speech, is that speech is not representative of all sounds; it carries meaning and is limited to communicative mouth actions, which, taken together, may activate different types of neural circuits than nonspeech sounds (Pulvermüller, 2001; Zatorre et al., 2002; Thierry et al., 2003; Schön et al., 2005; Özdemir et al., 2006). Thus, in the present study, we ask whether and how the mirror neuron system will respond to actions and sounds that do not have verbal meaning and, most importantly, are well controlled and newly acquired.…”
Section: Introduction (mentioning)
confidence: 99%
“…Most of these studies, however, have focused on the distinction between different types of materials (i.e., verbal vs. nonverbal), presented through the visual (Bright et al., 2004; Moore and Price, 1999; Vandenberghe et al., 1996; Vandenbulcke et al., 2006) and auditory (Dick et al., 2007; Thierry et al., 2003; Visser and Lambon Ralph, 2011) modalities. It is clear that the inferential versus referential distinction does not exactly map onto the verbal-nonverbal distinction.…”
Section: Spared Referential, Impaired Inferential Processing (mentioning)
confidence: 99%
“…Following this distinction, several studies have highlighted, in addition to a common left-lateralized semantic network, material-specific activations, involving left hemispheric regions for verbal stimuli and right hemispheric regions for nonverbal stimuli (Thierry et al., 2003; Thierry and Price, 2006; Vandenberghe et al., 1996, as reanalyzed by Thierry and Price, 2006; Vandenbulcke et al., 2006). Specifically, some authors found that left middle and superior temporal regions were selectively more involved for verbal material, while the right mid-fusiform and right posterior middle temporal cortex were selectively more involved for nonverbal processing (Thierry and Price, 2006; Vandenberghe et al., 1996, as in Thierry and Price, 2006; and for converging evidence in patients with neurodegenerative pathologies see Butler et al., 2009).…”
Section: Spared Referential, Impaired Inferential Processing (mentioning)
confidence: 99%