Unilateral aural stimulation has been shown to cause massive cortical reorganization in the congenitally deaf brain, particularly during the sensitive period of brain development. However, it remains unclear which side of stimulation provides the greatest advantages for auditory development. The left-hemisphere dominance of speech and linguistic processing in the normal-hearing adult brain has led to the assumption that right implantation confers functional and developmental advantages over left implantation, but existing evidence is controversial. To test this assumption and provide evidence for clinical decision-making, we examined 34 prelingually deaf children with unilateral cochlear implants using near-infrared spectroscopy. After controlling for age at implantation, residual hearing, and dominant hand, cortical processing of speech showed neither developmental progress nor an influence of implantation side in the weeks to months after implant activation. In sharp contrast, for nonspeech (music signal vs. noise) processing, left implantation showed functional advantages over right implantation that were not yet discernible using clinical, questionnaire-based outcome measures. These findings support the notion that the right hemisphere develops earlier and is better preserved from adverse environmental influences than its left counterpart. This study thus provides, to our knowledge, the first evidence for differential influences of left and right auditory peripheral stimulation on early cortical development of the human brain.
Speech perception depends on the dynamic interplay of bottom-up and top-down information along a hierarchically organized cortical network. Here, we test, for the first time in the human brain, whether neural processing of attended speech is dynamically modulated by task demand, using a context-free discrimination paradigm. Electroencephalographic signals were recorded during three parallel experiments that differed only in the phonological feature to be discriminated (word, vowel, and lexical tone, respectively). The event-related potentials (ERPs) revealed task modulation of speech processing at approximately 200 ms (P2) after stimulus onset, probably influencing which phonological information is retained in memory. For the phonological comparison of sequential words, task modulation occurred later, at approximately 300 ms (N3 and P3), reflecting the engagement of task-specific cognitive processes. The ERP results were consistent with changes in delta-theta neural oscillations, suggesting the involvement of cortical tracking of speech envelopes. The study thus provides neurophysiological evidence for goal-oriented modulation of attended speech and calls for speech perception models that incorporate limited memory capacity and goal-oriented optimization mechanisms.