Communicative hand gestures are often coordinated with prosodic aspects of speech, and salient moments of gestural movement (e.g., quick changes in speed) often co-occur with salient moments in speech (e.g., peaks in fundamental frequency and intensity). A common understanding is that such gesture-speech coordination is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical coupling of arm movements to the speech system has been identified as a potentially important factor in understanding the emergence of gesture-speech coordination. Specifically, for steady-state vocalizations and monosyllabic utterances, forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory-related activity and thereby affecting the F0 and intensity of vocalization. In the current experiment (N = 37), we extend this line of work to show that gesture-speech physics affects fluent speech, too. Compared with a no-movement condition, participants producing fluent, self-formulated speech while rhythmically moving their limbs showed heightened F0 and amplitude envelope, and these effects were more pronounced for higher-impulse arm movements than for lower-impulse wrist movements. We replicate the finding that acoustic peaks arise especially around moments of peak impulse (i.e., the beat of the movement), namely during its deceleration phases. Finally, higher deceleration rates of the higher-mass arm movements were related to higher acoustic peaks. These results confirm a role for the physical impulses of gesture in affecting the speech system. We discuss the implications of gesture-speech physics for understanding the emergence of communicative gesture, both ontogenetically and phylogenetically.
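The analysis summarized above hinges on aligning acoustic peaks (F0, amplitude envelope) with deceleration peaks of the limb movement. The following is a minimal, illustrative Python sketch of such a peak-alignment analysis; the file names (speech.wav, wrist_xyz.npy), sampling rates, F0 range, and peak thresholds are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np
import librosa
from scipy.signal import find_peaks

# --- acoustics: F0 and amplitude envelope ------------------------------------
y, sr = librosa.load("speech.wav", sr=None)            # hypothetical audio file
f0, _, _ = librosa.pyin(y, fmin=75, fmax=400, sr=sr)   # fundamental frequency (Hz)
env = librosa.feature.rms(y=y)[0]                      # RMS amplitude envelope
t_acoustic = librosa.frames_to_time(np.arange(len(env)), sr=sr)

# --- movement: speed and deceleration of the wrist/arm -----------------------
mocap_hz = 200                                         # assumed motion-tracking rate
pos = np.load("wrist_xyz.npy")                         # hypothetical (n, 3) positions
speed = np.linalg.norm(np.diff(pos, axis=0), axis=1) * mocap_hz
decel = np.clip(-np.gradient(speed) * mocap_hz, 0, None)  # keep deceleration only
t_move = np.arange(len(decel)) / mocap_hz

# deceleration peaks ("beats") and amplitude-envelope peaks
beat_idx, _ = find_peaks(decel, height=np.percentile(decel, 90))
env_idx, _ = find_peaks(env, height=np.percentile(env, 90))

# lag from each beat to the nearest envelope peak, and F0 around the beats
env_times = t_acoustic[env_idx]
lags = [env_times[np.argmin(np.abs(env_times - t))] - t for t in t_move[beat_idx]]
f0_frames = np.clip(np.searchsorted(t_acoustic, t_move[beat_idx]), 0, len(f0) - 1)
print(f"median envelope-to-beat lag: {np.median(lags):.3f} s")
print(f"mean F0 around beats: {np.nanmean(f0[f0_frames]):.1f} Hz")
```

Lags clustering near zero would indicate that amplitude-envelope peaks co-occur with the deceleration phase of the movement, in line with the effect reported in the abstract.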
Gestures and speech are clearly synchronized in many ways. However, previous studies have shown that the semantic similarity between gestures and speech breaks down as people approach transitions in understanding. Explanations for these gesture-speech mismatches, which focus on gestures and speech expressing different cognitive strategies, have been criticized for disregarding the integration and synchronization of gestures and speech. In the current study, we applied three different perspectives to investigate gesture-speech synchronization in an easy and a difficult task: temporal alignment, semantic similarity, and complexity matching. Participants engaged in a simple cognitive task and were assigned to either an easy or a difficult condition. We automatically measured pointing gestures and coded participants’ speech to determine the temporal alignment and semantic similarity between gestures and speech. Multifractal Detrended Fluctuation Analysis (MFDFA) was used to determine the extent of complexity matching between gestures and speech. We found that task difficulty indeed influenced gesture-speech synchronization in all three domains. We thereby extend the phenomenon of gesture-speech mismatches to difficult tasks in general. Furthermore, we investigated how temporal alignment, semantic similarity, and complexity matching were related in each condition, and how they predicted participants’ task performance. Our study illustrates how combining multiple perspectives originating from different research areas (i.e., coordination dynamics, complexity science, and cognitive psychology) provides novel understanding of cognitive concepts in general, and of gesture-speech synchronization and task difficulty in particular.
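The abstract names Multifractal Detrended Fluctuation Analysis (MFDFA) as the tool for quantifying complexity matching. Below is a minimal NumPy sketch of MFDFA, assuming first-order detrending, dyadic scales, and q in [-5, 5]; it estimates generalized Hurst exponents h(q) for a gesture and a speech time series and compares the spread of h(q) as a rough proxy for complexity matching. The placeholder series, scale range, and comparison metric are illustrative assumptions, not the study's actual settings.

```python
# Minimal MFDFA sketch (NumPy only); parameters are illustrative assumptions.
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Return generalized Hurst exponents h(q) for a 1-D time series."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated profile of the series
    n = len(profile)
    log_f = np.zeros((len(qs), len(scales)))
    for si, s in enumerate(scales):
        n_seg = n // s
        # non-overlapping segments taken from the start and from the end
        segs = np.concatenate([
            profile[:n_seg * s].reshape(n_seg, s),
            profile[n - n_seg * s:].reshape(n_seg, s),
        ])
        t = np.arange(s)
        var = np.empty(len(segs))
        for vi, seg in enumerate(segs):          # detrend each segment, keep residual variance
            coef = np.polyfit(t, seg, order)
            var[vi] = np.mean((seg - np.polyval(coef, t)) ** 2)
        for qi, q in enumerate(qs):
            if q == 0:                           # limiting case q -> 0
                log_f[qi, si] = 0.5 * np.mean(np.log(var))
            else:
                log_f[qi, si] = np.log(np.mean(var ** (q / 2))) / q
    # h(q) is the slope of log F_q(s) against log s
    log_s = np.log(scales)
    return np.array([np.polyfit(log_s, log_f[qi], 1)[0] for qi in range(len(qs))])

# Example: compare multifractal spread of a gesture and a speech series
rng = np.random.default_rng(0)
gesture = rng.standard_normal(4096).cumsum()     # placeholder movement series
speech = rng.standard_normal(4096).cumsum()      # placeholder acoustic series
scales = np.unique(np.logspace(4, 9, 12, base=2).astype(int))
qs = np.arange(-5, 6)
width_g, width_s = np.ptp(mfdfa(gesture, scales, qs)), np.ptp(mfdfa(speech, scales, qs))
print(f"complexity matching |delta width| = {abs(width_g - width_s):.3f}")
```

Smaller differences between the two h(q) spreads would indicate closer complexity matching between the gesture and speech series.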
Children move their hands to explore, learn, and communicate about hands-on tasks. Their hand movements seem to be “learning” ahead of their speech. Children shape their hand movements in accordance with spatial and temporal task properties, such as when they feel an object or simulate its movements. Their speech, however, does not directly correspond to these spatial and temporal task properties. We aimed to understand whether and how hand movements lead cognitive development through their ability to correspond to spatiotemporal task properties, an ability that speech lacks. We explored whether the variability of hand movements and speech changed with a change in spatiotemporal task properties, using two variability measures: Diversity, which indicates adaptation, and Complexity, which indicates flexibility. In two experiments, we asked children (4-7 years) to make predictions and give explanations about balance scale problems, and we manipulated either the length of the balance scale or the mass of the weights after half of the trials. In three out of four conditions, we found a change in Complexity for both hand movements and speech between the first and second halves of the task. In one of these conditions, we found a relation between the differences in Complexity and Diversity of hand movements and speech. Changes in spatiotemporal task properties thus often influenced the flexibility of both hand movements and speech, but there seem to be differences in how they did so. We provide several directions for future research to further unravel the relations between hand movements, speech, task properties, variability, and cognitive development.