2014
DOI: 10.1037/a0036281

Synchronization of speech and gesture: Evidence for interaction in action.

Abstract: Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of …

Cited by 45 publications (55 citation statements). References 27 publications.
“…It has been clearly shown that the interaction between gestures and speech can occur during both their planning and execution phases (e.g., Chu & Hagoort, 2014;Kita & Özyürek, 2003).…”
Section: Comparison of the Action Generation Hypothesis with Other Hypotheses (mentioning)
confidence: 99%
“…Implications for models of speech and gesture production Over the years, various models of speech and gesture production have been proposed, including Krauss, Chen and Gottesman's (2000) Process model, Kita and Özyürek's (2003) Interface model, de Ruiter's (2000) Sketch model, and McNeill and Duncan's (2000) Growth Point theory (see e.g., Chu & Hagoort, 2014;Hostetter & Alibali, 2008;Wagner, Malisz, & Kopp, 2014, for recent comparisons and discussion). These models all seek to describe how speakers produce multimodal utterances and are concerned with issues such as the timing and integration of gesture and speech, and the role that gestures play in communication.…”
Section: On the Effects of Visibility (mentioning)
confidence: 99%
“…Finally, in the third stage, the utterance plan is phonologically encoded and articulated, resulting in overt, auditory speech. Models of gesture production typically involve two stages: a Motor Planning stage, sometimes referred to as the Gesture Planner or the Action Generator, during which the motor instructions are produced, and a Motor Execution stage, during which these programs are executed, resulting in overt, visible gestures (Chu & Hagoort, 2014;Wagner et al, 2014).…”
Section: On the Effects of Visibility (mentioning)
confidence: 99%
“…The finding suggests that the fine motor articulation required for grasping is processed similarly by both hand and mouth in humans, thus they tend to complement each other. In fact, so tightly are the two motor systems entwined that when either gesture or speech is disrupted the other becomes delayed (Chu & Hagoort, 2014).…”
Section: Introduction (mentioning)
confidence: 99%