Purpose: Coverbal gesture use, which is affected by the presence and degree of aphasia, can be culturally specific. The purpose of this study was to compare gesture use among Cantonese-speaking individuals: 23 neurologically healthy speakers, 23 speakers with fluent aphasia, and 21 speakers with nonfluent aphasia. Method: Multimedia data of discourse samples from these speakers were extracted from the Cantonese AphasiaBank. Gestures were independently annotated on their forms and functions to determine how gesturing rate and distribution of gestures differed across speaker groups. A multiple regression was conducted to determine the most predictive variable(s) for gesture-to-word ratio. Results: Although speakers with nonfluent aphasia gestured most frequently, the rate of gesture use in counterparts with fluent aphasia did not differ significantly from controls. Different patterns of gesture functions in the 3 speaker groups revealed that gesture plays a minor role in lexical retrieval whereas its role in enhancing communication dominates among the speakers with aphasia. The percentages of complete sentences and dysfluency strongly predicted the gesturing rate in aphasia. Conclusions: The current results supported the sketch model of language-gesture association. The relationship between gesture production and linguistic abilities and clinical implications for gesture-based language intervention for speakers with aphasia are also discussed.

Gesture is the most common form of nonverbal behavior accompanying human communication. It is defined as spontaneous movements of hands and arms that flow simultaneously with speech (Kendon, 1980). A considerable number of studies have suggested that gestures are communicatively intended (e.g., de Ruiter, 2000; McNeill, 1992) and can enhance everyday social interaction.
Moreover, coverbal gestures can supplement the semantic content of oral output (Kendon, 2000) through their representation of spatial/directional and dynamic aspects of language content. Beattie and Shovelton (1999) further supported the supplementary role gestures play in human conversation by arguing that listeners could obtain more semantic information when a conversational partner had used both language and gestures (as compared with the language-only condition) within a verbal exchange task. The view that gestures can assist lexical retrieval during oral production was illustrated in the lexical retrieval hypothesis (LRH) reported by Krauss and Hadar (1999). To be more specific, it was proposed that the spatial and dynamic features of a concept represented by gestures could activate word retrieval. Gillespie, James, Federmeier, and Watson (2014) further suggested that impaired language competency, such as speech dysfluency or word-finding difficulty, was a primary driving force for the use of gestures. Additional evidence echoing the LRH was provided by Chawla and Krauss (1994), who demonstrated that unimpaired speakers used gestures more frequently during spontaneous speech as compared with rehearsed speech th...
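The abstract's key outcome measure (gesture-to-word ratio) and analysis (a multiple regression with percentage of complete sentences and percentage of dysfluency as predictors) can be sketched as follows. This is a minimal illustrative sketch, not the authors' analysis code; all numbers are hypothetical placeholder data, and the regression is a plain ordinary-least-squares fit.

```python
import numpy as np

def gesture_to_word_ratio(n_gestures, n_words):
    """Gestures produced per word in a discourse sample."""
    return n_gestures / n_words

# Hypothetical per-speaker data (NOT from the study): gesture-to-word ratio
# as the outcome, with percentage of complete sentences and percentage of
# dysfluency as predictors, mirroring the variables named in the abstract.
pct_complete_sentences = np.array([80.0, 55.0, 30.0, 65.0, 20.0])
pct_dysfluency = np.array([5.0, 15.0, 35.0, 10.0, 40.0])
ratio = np.array([0.05, 0.12, 0.30, 0.08, 0.38])

# Ordinary least-squares multiple regression:
# ratio ~ intercept + pct_complete_sentences + pct_dysfluency
X = np.column_stack([np.ones_like(ratio), pct_complete_sentences, pct_dysfluency])
coefs, *_ = np.linalg.lstsq(X, ratio, rcond=None)
intercept, b_complete, b_dysfluency = coefs
```

In practice one would also inspect coefficient significance and model fit (e.g., with a statistics package) rather than raw least-squares coefficients alone.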
Introduction: Gestures characterize individuals' nonverbal communicative exchanges and take on different functions. Several lines of neuroscientific research have investigated the neural correlates underlying the observation and execution of different gesture categories. In particular, studies focusing on the neural correlates of gesture observation have emphasized the presence of mirroring mechanisms in specific brain areas, which appear to be involved in gesture observation and action planning. Materials and methods: The present study used functional near-infrared spectroscopy (fNIRS) to investigate the neural mechanisms underlying the observation of affective, social, and informative gestures with positive and negative valence in dyads composed of an encoder and a decoder. Variations in oxygenated (O2Hb) and deoxygenated (HHb) hemoglobin concentrations were collected from both individuals simultaneously through a hyperscanning paradigm, allowing the recording of brain responsiveness and interbrain connectivity. Results: The results showed different brain activation and an increase in interbrain connectivity according to the type of gesture observed, with a significant increase in O2Hb brain responsiveness and interbrain connectivity and a decrease in HHb brain responsiveness for affective gestures in the dorsolateral prefrontal cortex (DLPFC) and for social gestures in the superior frontal gyrus (SFG). Furthermore, concerning the valence of the observed gestures, an increase in O2Hb brain activity and interbrain connectivity was observed in the left DLPFC for positive affective gestures compared to negative ones. Conclusion: The present study showed different brain responses underlying the observation of different types of positive and negative gestures.
Moreover, the interbrain connectivity calculation allowed us to highlight mirroring mechanisms in gesture-specific frontal regions during gesture observation and action planning.
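In fNIRS hyperscanning, interbrain connectivity is commonly quantified as the statistical dependence between the two participants' hemoglobin time series from matched channels. The sketch below illustrates the simplest such measure, a Pearson correlation between synthetic O2Hb signals; it is not the study's actual pipeline, and the signals, sampling rate, and channel labels are hypothetical.

```python
import numpy as np

def interbrain_connectivity(o2hb_encoder, o2hb_decoder):
    """Pearson correlation between two O2Hb time series recorded
    simultaneously (e.g., from a left-DLPFC channel) in encoder and decoder."""
    return np.corrcoef(o2hb_encoder, o2hb_decoder)[0, 1]

# Synthetic O2Hb signals: the decoder's signal partially tracks the encoder's,
# standing in for coupled hemodynamic responses during gesture observation.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)  # hypothetical 60 s recording at 10 Hz
encoder = np.sin(0.2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)
decoder = 0.8 * encoder + 0.3 * rng.standard_normal(t.size)

r = interbrain_connectivity(encoder, decoder)
```

Real analyses typically precede this step with band-pass filtering and motion-artifact correction, and may use wavelet coherence rather than a single correlation coefficient.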