Are there bi-directional influences between speech perception and music perception? An answer to this question is essential for understanding the extent to which the speech and music that we hear are processed by domain-general auditory processes and/or by distinct neural auditory mechanisms. This review summarizes a large body of behavioral and neuroscientific findings suggesting that the musical experience of trained musicians modulates speech processing, and a sparser set of data, largely on pitch processing, suggesting that linguistic experience, in particular learning a tone language, also modulates music processing. Although research has focused mostly on the effects of music on speech, we argue that both directions of influence need to be studied, and we conclude that the picture that emerges is one of mutual interaction across domains. In particular, it is not simply that experience with spoken language has some effects on music perception, and vice versa, but that, because of shared domain-general subcortical and cortical networks, experience in each domain influences behavior in both domains.
Gesture is an integral part of children's communicative repertoire. However, little is known about the neurobiology of speech and gesture integration in the developing brain. We investigated how 8- to 10-year-old children processed gesture that was essential to understanding a set of narratives. We asked whether the functional neuroanatomy of gesture-speech integration varies as a function of (1) the content of speech, and/or (2) individual differences in how gesture is processed. When gestures provided missing information not present in the speech (i.e., disambiguating gesture; e.g., "pet" + flapping palms = bird), the presence of gesture led to increased activity in the inferior frontal gyri, the right middle temporal gyrus, and the left superior temporal gyrus, compared to when gesture provided redundant information (i.e., reinforcing gesture; e.g., "bird" + flapping palms = bird). This pattern of activation was found only in children who were able to successfully integrate gesture and speech behaviorally, as indicated by their performance on post-test story comprehension questions. Children who did not glean meaning from gesture did not show differential activation across the two conditions. Our results suggest that the brain activation pattern for gesture-speech integration in children overlaps with, but is broader than, the pattern in adults performing the same task. Overall, our results provide a possible neurobiological mechanism that could underlie children's increasing ability to integrate gesture and speech over childhood, and account for individual differences in that integration.
Children vary greatly in their vocabulary development during the preschool years. Importantly, the pace of this early vocabulary growth predicts vocabulary size at school entrance. Despite its importance for later academic success, little is known about the relation between individual differences in early vocabulary development and later brain structure and function. Here we examined the association between vocabulary growth in children, as estimated from longitudinal measurements from 14 to 58 months, and individual differences in brain structure measured in 3rd and 4th grade (8–10 years old). Our results show that the pace of vocabulary growth uniquely predicts cortical thickness in the left supramarginal gyrus. Probabilistic tractography revealed that this region is directly connected to the inferior frontal gyrus (pars opercularis) and the ventral premotor cortex, most probably via the superior longitudinal fasciculus III. Our findings demonstrate, for the first time, a relation between the pace of vocabulary learning in children and the structure of the cerebral cortex, specifically cortical thickness in the left supramarginal gyrus. They also highlight that differences in the pace of vocabulary growth are associated with the dorsal language stream, which is thought to support speech perception and articulation.