2020
DOI: 10.1080/23273798.2020.1737719

Evidence for children’s online integration of simultaneous information from speech and iconic gestures: an ERP study

Abstract: Children perceive iconic gestures along with the speech they hear. Previous studies have shown that children integrate information from both modalities. Yet it is not known whether children can integrate both types of information simultaneously as soon as they are available (as adults do) or whether they initially process them separately and integrate them later. Using electrophysiological measures, we examined the online neurocognitive processing of gesture-speech integration in 6- to 7-year-old children. We focus…

Cited by 11 publications (20 citation statements); references 45 publications.
“…Studies showed that 3-year-olds could not integrate speech and gesture, whereas 5-year-olds and adults could (e.g., Sekine and Kita, 2015; Sekine et al., 2015). Moreover, children from 6 years of age onward integrate speech and gesture in an online fashion comparable to adults (Dick et al., 2012; Sekine et al., in press). Demir-Lira et al. (2018) showed that gesture-speech integration in children recruits the same neural network as in adults.…”
Section: Individual Differences in Gesture Processing
confidence: 99%
“…With electroencephalography (EEG) methods such as event-related potentials (ERPs), it has been suggested that the human brain is able to rapidly integrate semantic representations from gesture and language, as reflected by the N400 component. The N400 has been consistently observed for a variety of experimental manipulations, for example, semantic mismatch or semantic ambiguity (Holle & Gunter, 2007; Özyürek, Willems, Kita, & Hagoort, 2007), for language stimuli presented in either auditory or visual form (Fabbri-Destro et al., 2015; Özyürek et al., 2007), and for gesture and speech stimuli presented simultaneously or consecutively, across adults and children (Fabbri-Destro et al., 2015; Habets, Kita, Shao, Özyürek, & Hagoort, 2011; Kelly, Kravitz, & Hopkins, 2004; Sekine et al., 2020; Wu & Coulson, 2005). Of note, despite consistent reports (see Özyürek, 2014 for review), it remains an open question whether these N400 effects reflect the cost of semantically integrating gesture and language, the differential level of semantic prediction from gesture to language and vice versa, or a combination of both processes.…”
Section: Introduction
confidence: 99%
“…To the best of the authors’ knowledge, only one electrophysiological study has investigated gesture-speech integration in children. In that study, Sekine et al. (2020) observed a larger N400 component for incongruent trials than for congruent ones. In line with data from behavioral studies on the development of gesture-speech integration in children (Stanfield et al., 2013; Sekine et al., 2015; Glasser et al., 2018), this finding suggests that by the age of 6, children process gesture-speech information in a qualitatively similar way to adults.…”
Section: Investigating the Relationship Between Iconic Gestures and Language
confidence: 84%