Previous neuroimaging studies have suggested that developmental dyslexia has a different neural basis in Chinese and English populations because of known differences in the processing demands of the Chinese and English writing systems. Here, using functional magnetic resonance imaging, we provide the first direct statistically based investigation into how the effect of dyslexia on brain activation is influenced by the Chinese and English writing systems. Brain activation for semantic decisions on written words was compared in English dyslexics, Chinese dyslexics, English normal readers and Chinese normal readers, while controlling for all other experimental parameters. By investigating the effects of dyslexia and language in one study, we show common activation in Chinese and English dyslexics despite different activation in Chinese versus English normal readers. The effect of dyslexia in both languages was observed as less than normal activation in the left angular gyrus and in left middle frontal, posterior temporal and occipitotemporal regions. Differences in Chinese and English normal reading were observed as increased activation for Chinese relative to English in the left inferior frontal sulcus; and increased activation for English relative to Chinese in the left posterior superior temporal sulcus. These cultural differences were not observed in dyslexics who activated both left inferior frontal sulcus and left posterior superior temporal sulcus, consistent with the use of culturally independent strategies when reading is less efficient. By dissociating the effect of dyslexia from differences in Chinese and English normal reading, our results reconcile brain activation results with a substantial body of behavioural studies showing commonalities in the cognitive manifestation of dyslexia in Chinese and English populations. They also demonstrate the influence of cognitive ability and learning environment on a common neural system for reading.
Although interactivity is considered a fundamental principle of cognitive (and computational) models of reading, it has received far less attention in neural models of reading, which instead focus on serial stages of feed-forward processing from visual input to orthographic processing to accessing the corresponding phonological and semantic information. In particular, the left ventral occipito-temporal (vOT) cortex is proposed to be the first stage where visual word recognition occurs, prior to accessing nonvisual information such as semantics and phonology. We used functional magnetic resonance imaging (fMRI) to investigate whether there is evidence that activation in vOT is influenced top-down by the interaction of visual and nonvisual properties of the stimuli during visual word recognition tasks. Participants performed two different types of lexical decision tasks that focused on either visual or nonvisual properties of the word or word-like stimuli. The design allowed us to investigate how vOT activation during visual word recognition was influenced by a task change to the same stimuli and by a stimulus change during the same task. We found both stimulus- and task-driven modulation of vOT activation that can only be explained by top-down processing of nonvisual aspects of the task and stimuli. Our results are consistent with the hypothesis that vOT acts as an interface linking visual form with nonvisual processing in both bottom-up and top-down directions. Such interactive processing at the neural level is in agreement with cognitive and computational models of reading but challenges some of the assumptions made by current neuro-anatomical models of reading.
To investigate how hearing status, sign language experience, and task demands influence functional responses in the human superior temporal cortices (STC), we collected fMRI data from deaf and hearing participants (male and female) who acquired sign language either early or late in life. Our stimuli in all tasks were pictures of objects. We varied the linguistic and visuospatial processing demands in three different tasks that involved decisions about (1) the sublexical (phonological) structure of the British Sign Language (BSL) signs for the objects, (2) the semantic category of the objects, and (3) the physical features of the objects. Neuroimaging data revealed that in participants who were deaf from birth, STC showed increased activation during visual processing tasks. Importantly, this differed across hemispheres. Right STC was consistently activated regardless of the task, whereas left STC was sensitive to task demands. Significant activation was detected in the left STC only for the BSL phonological task. This task, we argue, placed greater demands on visuospatial processing than the other two tasks. In hearing signers, enhanced activation was absent in both left and right STC during all three tasks. Lateralization analyses demonstrated that the effect of deafness was more task-dependent in the left than the right STC, whereas it was more task-independent in the right than the left STC. These findings indicate how the absence of auditory input from birth leads to dissociable and altered functions of left and right STC in deaf participants. SIGNIFICANCE STATEMENT Those born deaf can offer unique insights into neuroplasticity, in particular in regions of superior temporal cortex (STC) that primarily respond to auditory input in hearing people. Here we demonstrate that in those deaf from birth the left and the right STC have altered and dissociable functions. The right STC was activated regardless of demands on visual processing.
In contrast, the left STC was sensitive to the demands of visuospatial processing. Furthermore, hearing signers, with the same sign language experience as the deaf participants, did not activate the STCs. Our data advance current understanding of neural plasticity by determining the differential effects that hearing status and task demands can have on left and right STC function.
Highlights
- Sequential delivery of letters in words encourages the use of phonological assembly.
- Greater activation in left SMG, POp and precentral gyrus during sequential delivery.
- Activation for 'phonological assembly' not confounded with stimulus properties.
- Activation for 'phonological assembly' not wholly attributable to processing load.