Purpose: Coverbal gesture use, which is affected by the presence and degree of aphasia, can be culturally specific. The purpose of this study was to compare gesture use among Cantonese-speaking individuals: 23 neurologically healthy speakers, 23 speakers with fluent aphasia, and 21 speakers with nonfluent aphasia.

Method: Multimedia data of discourse samples from these speakers were extracted from the Cantonese AphasiaBank. Gestures were independently annotated for their forms and functions to determine how gesturing rate and the distribution of gestures differed across speaker groups. A multiple regression was conducted to determine the most predictive variable(s) for gesture-to-word ratio.

Results: Although speakers with nonfluent aphasia gestured most frequently, the rate of gesture use in counterparts with fluent aphasia did not differ significantly from that of controls. Different patterns of gesture functions in the 3 speaker groups revealed that gesture plays a minor role in lexical retrieval, whereas its role in enhancing communication dominates among the speakers with aphasia. The percentages of complete sentences and dysfluency strongly predicted the gesturing rate in aphasia.

Conclusions: The current results supported the sketch model of language-gesture association. The relationship between gesture production and linguistic abilities, as well as clinical implications for gesture-based language intervention for speakers with aphasia, is also discussed.

Gesture is the most common form of nonverbal behavior accompanying human communication. It is defined as spontaneous movements of the hands and arms that flow simultaneously with speech (Kendon, 1980). A considerable number of studies have suggested that gestures are communicatively intended (e.g., de Ruiter, 2000; McNeill, 1992) and can enhance everyday social interaction. Moreover, coverbal gestures can supplement the semantic content of oral output (Kendon, 2000) through their representation of spatial/directional and dynamic aspects of language content. Beattie and Shovelton (1999) further supported the supplementary role gestures play in human conversation by showing that listeners obtained more semantic information when a conversational partner used both language and gestures (as compared with a language-only condition) within a verbal exchange task.

The view that gestures can assist lexical retrieval during oral production was illustrated in the lexical retrieval hypothesis (LRH) reported by Krauss and Hadar (1999). More specifically, it was proposed that the spatial and dynamic features of a concept represented by gestures could activate word retrieval. Gillespie, James, Federmeier, and Watson (2014) further suggested that impaired language competency, such as speech dysfluency or word-finding difficulty, was a primary driving force for the use of gestures. Additional evidence echoing the LRH was provided by Chawla and Krauss (1994), who demonstrated that unimpaired speakers used gestures more frequently during spontaneous speech as compared with rehearsed speech th...
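As a minimal illustrative sketch of the analysis described in the abstract (not the authors' actual code or data), the snippet below computes a gesture-to-word ratio per speaker and fits a multiple regression with two hypothetical predictors corresponding to the reported ones: percentage of complete sentences and a dysfluency measure. All variable names and values are assumptions for illustration only.

```python
# Hypothetical sketch: gesture-to-word ratio and a multiple regression,
# assuming per-speaker counts of gestures and words plus two discourse
# measures (illustrative placeholders, not data from the study).
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: one row per speaker.
data = pd.DataFrame({
    "n_gestures":        [12, 30, 8, 45, 20, 15],
    "n_words":           [300, 150, 400, 120, 250, 320],
    "pct_complete_sent": [85.0, 40.0, 90.0, 30.0, 60.0, 75.0],
    "dysfluency_pct":    [5.0, 25.0, 3.0, 35.0, 15.0, 8.0],
})

# Gesture-to-word ratio: gestures produced per word of discourse.
data["gesture_word_ratio"] = data["n_gestures"] / data["n_words"]

# Ordinary least squares with the two predictors of interest.
model = smf.ols(
    "gesture_word_ratio ~ pct_complete_sent + dysfluency_pct",
    data=data,
).fit()
print(model.summary())
```

In this sketch, the sign and magnitude of each coefficient would indicate how sentence completeness and dysfluency relate to gesturing rate; the study's actual predictors, coding scheme, and model specification may differ.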