2017 · DOI: 10.3390/brainsci7060060
Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype

Abstract: When a speaker talks, the consequences of this can both be heard (audio) and seen (visual). A novel visual phonemic restoration task was used to assess behavioral discrimination and neural signatures (event-related potentials, or ERP) of audiovisual processing in typically developing children with a range of social and communicative skills assessed using the social responsiveness scale, a measure of traits associated with autism. An auditory oddball design presented two types of stimuli to the listener, a clea…

Cited by 13 publications (20 citation statements) · References 42 publications (47 reference statements)
“…Several publications have reported deterioration in speech perception in patients with ASDs. It has been advocated to be a component of the global sensory deficit ( 32 , 33 , 34 ). One interesting hypothesis regarding this issue is the integration deficit of the auditory and visual speech information.…”
Section: Central Auditory Processing Disorders and Autism Spectrum DImentioning
confidence: 99%
“…For many years, there has been evidence that visual information about speech influences what listeners hear, including increasing identification of the speech signal in the context of background noise ( Sumby and Pollack, 1954 ; Grant et al, 1998 ). This influence of visual speech has been reported for a wide range of ages ( Irwin et al, 2017a , b ), for persons with typical and reduced hearing ( Sommers et al, 2005 ), for clinical populations such as persons with autism ( Kuhl et al, 2005 ; Stevenson et al, 2014 ; Irwin et al, 2022 ), and for nonnative speakers of English ( Reisberg et al, 1987 ). The presence of visual articulatory information can also facilitate the perception of heard speech, speeding up cortical processing of the auditory signal ( van Wassenhove et al, 2005 ) and facilitating language processing ( MacDonald et al, 2000 ; Lachs and Pisoni, 2004 ).…”
Section: Introductionmentioning
confidence: 79%
“…Research on audiovisual communication has extended into child populations with similar results. Irwin et al (2017a , b) demonstrated that typically-developing children between the ages of 5–10 years show increased gaze to the mouth of a speaker in a range of audiovisual environments, including audiovisual speech, audiovisual speech in noise, and in an audiovisual mismatch condition.…”
Section: Introductionmentioning
confidence: 99%
“…Second, research in the blue cohort concentrated on studies of cognitive processes in speech perception, such as audio-visual speech perception with application of state-of-the-art brain science technologies ( Sato et al, 2010 ; Irwin et al, 2017 ). For example, Irwin et al (2017) tested typically developing children’s ability to detect the missing segment in spoken syllables with and without visual information and measured their behavior and neural activity. They found that visual information attenuated the brain responses, suggesting that children were less sensitive to the missing segment when visual information of a face producing that segment was presented.…”
Section: Resultsmentioning
confidence: 99%