Why is it that people cannot keep their hands still when they talk? One reason may be that gesturing actually lightens cognitive load while a person is thinking of what to say. We asked adults and children to remember a list of letters or words while explaining how they solved a math problem. Both groups remembered significantly more items when they gestured during their math explanations than when they did not gesture. Gesturing appeared to conserve speakers' cognitive resources on the explanation task, permitting them to allocate more resources to the memory task. It is widely accepted that gesturing reflects a speaker's cognitive state, but our observations suggest that, by reducing cognitive load, gesturing may also play a role in shaping that state.
Most theories of pragmatics take the verbal content of spoken or written utterances as the basic unit of communication. However, many of these theories overlook the fact that important information about an utterance's meaning can be conveyed nonverbally. In the present study, we investigate the pragmatic role that hand gestures play in language comprehension and memory. In Experiments 1 and 2, people were more likely to interpret an utterance as an indirect request when speech was accompanied by a relevant pointing gesture than when speech or gesture was presented alone. Following up on this, Experiment 3 supported the idea that speech and gesture mutually disambiguate one another's meaning. Finally, Experiment 4 generalized the findings to a different type of speech act (recollection of events) and a different type of gesture (iconic gestures). The results from these experiments suggest that studying pragmatic understanding may require units of analysis broader than the verbal message alone.
Gesture and speech are assumed to form an integrated system during language production. Based on this view, we propose the integrated-systems hypothesis, which holds that gesture and speech are integrated in language comprehension in two ways: through mutual and obligatory interactions. Experiment 1 presented participants with action primes (e.g., someone chopping vegetables) and bimodal speech-and-gesture targets. Participants related primes to targets more quickly and accurately when the targets contained congruent information (speech: "chop"; gesture: chop) than when they contained incongruent information (speech: "chop"; gesture: twist). Moreover, the strength of the incongruence affected processing, with fewer errors for weak incongruities (speech: "chop"; gesture: cut) than for strong incongruities (speech: "chop"; gesture: twist). Crucially for the integrated-systems hypothesis, this influence was bidirectional. Experiment 2 demonstrated that gesture's influence on speech was obligatory. The results confirm the integrated-systems hypothesis and demonstrate that gesture and speech form an integrated system in language comprehension.