How do deaf and deafblind individuals process touch? This question offers a unique model for understanding the prospects and constraints of neural plasticity. Our brain constantly receives signals from the environment and integrates them into the most reliable representation of the world. The nervous system adapts its functional and structural organization according to its input, and perceptual processing develops as a function of individual experience. However, many questions remain unresolved regarding which factors drive these changes in deaf and deafblind individuals, and findings to date are inconsistent. Most studies have not taken the sensory and linguistic experiences of the included participants into account. As a result, the relative contributions of sensory deprivation and language experience to somatosensory processing remain unclear. Even less is known about the impact of deafblindness on brain development. The resulting neural adaptations could be even more substantial, but no clear patterns have yet been identified. How do deafblind individuals process sensory input? Studies on deafblindness have mostly focused on single cases or on groups of late-blind individuals. Importantly, the language backgrounds of deafblind communities are highly variable and include the use of tactile languages. So far, this kind of linguistic experience and its consequences have not been considered in studies of basic perceptual functions. Here, we provide a critical review of the literature, aiming to identify determinants of neuroplasticity and gaps in our current knowledge of somatosensory processing in deaf and deafblind individuals.
The extent to which early experience shapes the functional characteristics of neural circuits is still a matter of debate. In the present study, we tested whether congenital deafness and/or the acquisition of a sign language alters the temporal processing characteristics of the visual system. Moreover, assuming cross-modal plasticity in deaf individuals, we investigated whether the temporal processing characteristics of possibly reorganised auditory areas resemble those of the visual cortex. Steady-state visual evoked potentials (SSVEPs) were recorded in congenitally deaf native signers, hearing native signers, and hearing nonsigners. The luminance of the visual stimuli was periodically modulated at 12, 21, and 40 Hz. For hearing nonsigners, the optimal driving rate was 12 Hz. By contrast, for the group of hearing signers, the optimal driving rates were 12 and 21 Hz, whereas for the group of deaf signers, the optimal driving rate was 21 Hz. We did not observe evidence for cross-modal recruitment of auditory cortex in the group of deaf signers. These results suggest a higher preferred neural processing rate as a consequence of the acquisition of a sign language.