“…Studies using electroencephalography (EEG) methods such as event-related potentials (ERPs) suggest that the human brain rapidly integrates semantic representations from gesture and language, as reflected by the N400 component. The N400 has been observed consistently across a variety of experimental manipulations, for example, semantic mismatch or semantic disambiguation (Holle & Gunter, 2007; Özyürek, Willems, Kita, & Hagoort, 2007), for language stimuli presented in either auditory or visual form (Fabbri-Destro et al., 2015; Özyürek et al., 2007), and for gesture and speech stimuli presented simultaneously or consecutively, in both adults and children (Fabbri-Destro et al., 2015; Habets, Kita, Shao, Özyürek, & Hagoort, 2011; Kelly, Kravitz, & Hopkins, 2004; Sekine et al., 2020; Wu & Coulson, 2005). Of note, despite these consistent reports (see Özyürek, 2014, for a review), it remains an open question whether such N400 effects reflect the cost of semantically integrating gesture and language, the differential degree of semantic prediction from gesture to language and vice versa, or a combination of both processes.…”
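To make the ERP methodology concrete, the sketch below illustrates how an N400 effect of the kind described above is typically quantified: as the difference in mean amplitude within the canonical 300–500 ms window at centro-parietal electrodes between semantically matching and mismatching gesture–speech trials. This is a minimal, illustrative sketch only, not the analysis pipeline of any of the cited studies; it uses MNE-Python, and all parameters (sampling rate, electrode sites, trial counts, simulated amplitudes) are assumptions standing in for real data.

```python
# Minimal sketch (assumed parameters, simulated data; not from the cited studies)
# of quantifying an N400 effect: mean amplitude in the 300-500 ms window at
# centro-parietal electrodes, mismatching vs. matching gesture-speech trials.
import numpy as np
import mne

rng = np.random.default_rng(0)
sfreq = 250.0                      # sampling rate in Hz (assumed)
ch_names = ["Cz", "CPz", "Pz"]     # centro-parietal sites where the N400 is typically maximal
info = mne.create_info(ch_names, sfreq, ch_types="eeg")

n_trials, tmin, tmax = 40, -0.2, 0.8
times = np.arange(tmin, tmax, 1.0 / sfreq)

def simulate(n400_amp_uv):
    """Simulate trials with a negative-going deflection peaking around 400 ms."""
    component = n400_amp_uv * 1e-6 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0.0, 2e-6, (n_trials, len(ch_names), len(times)))
    return noise + component  # component broadcasts over trials and channels

# Mismatching gesture-speech pairs -> larger (more negative) N400 (assumed amplitudes)
match = mne.EpochsArray(simulate(-1.0), info, tmin=tmin, verbose=False)
mismatch = mne.EpochsArray(simulate(-4.0), info, tmin=tmin, verbose=False)

def mean_amp_uv(epochs):
    """Mean amplitude (microvolts) of the average ERP in the 300-500 ms window."""
    evoked = epochs.average().crop(tmin=0.3, tmax=0.5)
    return evoked.data.mean() * 1e6

effect = mean_amp_uv(mismatch) - mean_amp_uv(match)
print(f"N400 effect (mismatch - match): {effect:.2f} uV")  # negative difference
```

In a real study, the condition difference would then be tested statistically across participants (e.g., with a repeated-measures test over the per-subject mean amplitudes), which is the comparison that underlies the N400 effects reviewed above.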