In this selective review, I outline a number of ways in which seeing the talker affects auditory perception of speech, including, but not confined to, the McGurk effect. To date, studies suggest that all linguistic levels are susceptible to visual influence, and that two main modes of processing can be described: a complementary mode, whereby vision provides information more efficiently than hearing for some under-specified parts of the speech stream, and a correlated mode, whereby vision partially duplicates information about dynamic articulatory patterning. Cortical correlates of seen speech suggest that at the neurological as well as the perceptual level, auditory processing of speech is affected by vision, so that 'auditory speech regions' are activated by seen speech. The processing of natural speech, whether it is heard, seen, or heard and seen, activates the perisylvian language regions (left > right). It is highly probable that activation occurs in a specific order: first superior temporal, then inferior parietal, and finally inferior frontal regions (left > right). There is some differentiation of the visual input stream to the core perisylvian language system, suggesting that complementary seen speech information makes special use of the visual ventral processing stream, while for correlated visual speech the dorsal processing stream, which is sensitive to visual movement, may be relatively more involved.
This thesis examines selective attention in young adults with Autism Spectrum Disorder (ASD). Existing literature on this issue is mixed: some research suggests an overly focused attentional style (Rincover & Ducharme, 1987), while other work highlights an abnormally broad attentional lens (Burack, 1994). The research presented here has, for the first time, examined selective attention in individuals with ASD using a theoretically led approach based on Lavie's Load Theory of attention and cognitive control (Lavie et al., 2004). Load theory states that the perceptual load (the amount of potentially task-relevant information) of a task affects selective attention. This theory may explain the equivocal findings in the current data on selective attention and ASD. Using behavioural measures, the pattern of selective attention under various levels of load was explored in individuals with ASD and matched controls. The results provide evidence of increased perceptual capacity in ASD. This means that, at any one time, individuals with ASD may be able to process more information from the visual environment. This increase in capacity was evident on tasks of both unconscious and conscious perception. In light of the social deficits observed in the condition, the work in this thesis also explored selective attention in the presence of social distractor stimuli. Results indicated that faces are less salient for individuals with ASD and, unlike for typical adults, are not processed in an automatic and mandatory fashion. These results bring together findings on selective attention with work on social processing in an attempt to find basic abnormalities which might be fundamental in explaining the disorder.
Recent findings suggest that children with autism may be impaired in the perception of biological motion from moving point-light displays. Some children with autism also have abnormally high motion coherence thresholds. In the current study we tested a group of children with autism and a group of typically developing children aged 5 to 12 years on several motion perception tasks, in order to establish the specificity of the biological motion deficit in relation to other visual discrimination skills. The first task required discriminating biological from scrambled motion. Three quasi-psychophysical tasks then established individual thresholds for the detection of biological motion in dynamic noise, of motion coherence, and of form-from-motion. Lastly, individual thresholds for a task of static perception, contour integration (Gabor displays), were also obtained. Compared to controls, children with autism were particularly impaired in processing biological motion in relation to any developmental measure (chronological or mental age). In contrast, there was some developmental overlap between typically developing children and children with autism in the ability to process other types of visual motion, and evidence of developmental change in both groups. Finally, Gabor display thresholds appeared to develop typically in children with autism.
Purpose: We describe the development of a new Test of Child Speechreading (ToCS) specifically designed for use with deaf and hearing children. Speechreading is a skill which is required for deaf children to access the language of the hearing community. ToCS is a deaf-friendly, computer-based test that measures child speechreading (silent lipreading) at three psycholinguistic levels: words, sentences and short stories. The aims of the study were to standardize ToCS with deaf and hearing children and investigate the effects of hearing status, age and linguistic complexity on speechreading ability. Method: 86 severely and profoundly deaf and 91 hearing children aged between 5 and 14 years participated. The deaf children were from a range of language and communication backgrounds, and their preferred mode of communication varied. Results: Speechreading skills significantly improved with age for both deaf and hearing children. There was no effect of hearing status on speechreading ability, and deaf and hearing children showed similar performance across all subtests of ToCS. Conclusions: The Test of Child Speechreading (ToCS) is a valid and reliable assessment of speechreading ability in school-aged children that can be used to measure individual differences in speechreading performance.

Typical face-to-face communication is multi-modal, and speech perception involves the integration of both auditory and visual information (Rosenblum, 2005). The integration of visual and auditory speech seems to occur very early on, as young babies are not only sensitive to the visual component of speech (e.g. Dodd & Burnham, 1988; Kuhl & Meltzoff, 1982; Patterson & Werker, 1999) but can detect visual-auditory synchronisation (Dodd, 1979) and even match visual-auditory vowels (Patterson & Werker, 2003).
Importantly, McGurk effects have been observed in infants as young as 4.5 months using classic habituation and dishabituation paradigms (Burnham & Dodd, 2004; Rosenblum, Schmuckler, & Johnson, 1997). This suggests that visual speech contributes to speech processing even in pre-lingual children, thereby strengthening the argument that speechreading (visual-alone speech perception) is a natural part of speech processing (e.g. Massaro, 1987). Further support can also be found in recent evidence from neuroimaging studies suggesting that silent speechreading activates similar neural circuitry to audio-visual speech (e.g. Calvert et al., 1997; Pekkola et al., 2005). For many deaf and hearing-impaired individuals, speechreading is the main means of access to the spoken language of the hearing community, and yet historically hearing people have often been reported as having at least equivalent, if not better, speechreading skills than deaf individuals (e.g. Arnold & Kopsel, 1996; Conrad, 1977; Green, Green, & Holmes, 1981; Massaro, 1987; Mogford, 1987). Most of these speechreading assessments were designed to be used with hearing individuals and therefore contained complex...