Although sensory processing challenges have been noted since the first clinical descriptions of autism, it was not until the release of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) in 2013 that sensory problems were included among the core symptoms of autism spectrum disorder (ASD) in the diagnostic profile. Because sensory information forms the building blocks for higher-order social and cognitive functions, we argue that sensory processing is not merely an additional piece of the puzzle but rather a critical cornerstone for characterizing and understanding ASD. In this review we discuss what is currently known about sensory processing in ASD, how sensory function fits within contemporary models of ASD, and what is understood about the differences in the underlying neural processing of sensory and social communication signals observed between individuals with and without ASD. In addition to highlighting the sensory features associated with ASD, we emphasize the importance of multisensory processing in building perceptual and cognitive representations, and how deficits in multisensory integration may themselves be a core characteristic of ASD.
Over the next two decades, a dramatic demographic shift will take place, with rapid growth in the population of older adults. One of the most common complaints accompanying healthy aging is a decreased ability to successfully perceive speech, particularly in noisy environments. In such environments, the presence of visual speech cues (i.e., lip movements) provides striking benefits for speech perception and comprehension, but previous research suggests that older adults gain less from such audiovisual integration than their younger peers. To determine the processing level at which these behavioral differences arise in healthy aging, we administered a speech-in-noise task to younger and older adults. We compared the perceptual benefits of having speech information available in both the auditory and visual modalities and examined both phoneme and whole-word recognition across varying signal-to-noise ratios (SNRs). For whole-word recognition, older adults showed greater multisensory gains than their younger counterparts at intermediate SNRs, but reduced benefit at low SNRs. By contrast, at the phoneme level both younger and older adults showed approximately equivalent increases in multisensory gain as SNR decreased. Collectively, the results provide important insights into both the similarities and the differences in how older and younger adults integrate auditory and visual speech cues in noisy environments, and they help explain some of the conflicting findings in previous studies of multisensory speech perception in healthy aging. These findings suggest that audiovisual processing is intact at more elementary levels of speech perception in healthy aging and that deficits emerge only at the more complex level of whole-word recognition.
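The abstracts here do not specify how multisensory gain was computed; as an illustrative sketch only, two formulations common in the audiovisual speech-in-noise literature are the raw audiovisual benefit over the auditory-only baseline and the proportional gain of Sumby and Pollack (1954), which normalizes that benefit by the remaining headroom:

$$
\mathrm{Gain}_{\mathrm{raw}} = P_{AV} - P_{A}, \qquad \mathrm{Gain}_{\mathrm{prop}} = \frac{P_{AV} - P_{A}}{1 - P_{A}},
$$

where $P_{A}$ and $P_{AV}$ denote the proportions of phonemes or whole words correctly recognized in the auditory-only and audiovisual conditions at a given SNR. Under the principle of inverse effectiveness invoked below, proportional gain typically grows as SNR, and hence $P_{A}$, decreases. Whether the studies summarized here used these exact measures is an assumption on our part, not something stated in the abstracts.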
Lay Abstract
In a noisy environment it can be difficult to understand what a speaker is saying, but being able to see the speaker’s mouth makes it easier. In fact, the noisier the environment, the more it helps to both see and hear a speaker. This is due to a process known as multisensory integration, in which information from two senses, in this case audition and vision, is combined. Autistic children often have difficulty with multisensory integration, which may add to difficulties with social communication. We tested how well autistic and typically developing children integrated auditory and visual speech, both at the level of whole words and at the level of individual speech sounds, or phonemes. At both the whole-word and the phoneme level, autistic children were less able to correctly identify speech, regardless of how it was presented (audio-only, visual-only, or audiovisual). At the phoneme level, autistic children benefited as much as typically developing children from seeing the mouth movements that accompanied the auditory speech. In noisy environments, however, these equivalent gains in phoneme perception did not extend to whole-word recognition, where autistic children showed reduced benefit from seeing the mouth movements while hearing auditory speech. Such reductions in speech perception abilities, including a reduced benefit from visual speech cues in noisy environments, may contribute to social communication difficulties in autistic children.
Scientific Abstract
Speech perception in noisy environments is boosted when a listener can see the speaker’s mouth and integrate the auditory and visual speech information. Autistic children have a diminished capacity to integrate sensory information across modalities, which contributes to core symptoms of autism, such as impairments in social communication. We investigated the abilities of autistic and typically developing (TD) children to integrate auditory and visual speech at various levels of noise, measuring both whole-word and phoneme recognition. At the level of whole-word recognition, autistic children exhibited reduced performance in both the auditory and audiovisual modalities. Importantly, autistic children showed reduced behavioral benefit from multisensory integration for whole-word recognition, specifically at low signal-to-noise ratios (SNRs). At the level of phoneme recognition, autistic children exhibited reduced performance relative to their TD peers in the auditory, visual, and audiovisual modalities. However, and in contrast to their performance at the level of whole-word recognition, both autistic and TD children benefited from multisensory integration during phoneme recognition. In accordance with the principle of inverse effectiveness, both groups exhibited greater benefit at low SNRs relative to high SNRs. Thus, while autistic children showed typical multisensory benefits during phoneme recognition, these benefits did not translate into typical multisensory benefits for whole-word recognition in noisy environments.
Electrical stimulation of visual cortex can produce a visual percept (phosphene). We electrically stimulated visual cortex in human patients implanted with subdural electrodes while recording from other brain sites. Across experimental manipulations, we found that phosphene perception occurred only if stimulation evoked high-frequency gamma oscillations in the temporoparietal junction (TPJ), a brain region associated with visual extinction and neglect. Electrical stimulation of TPJ modified detectability of low-contrast visual stimuli.