The primary goal of this work is to examine prosodic structure as expressed concurrently through articulatory and manual gestures. Specifically, we investigated the effects of phrase-level prominence (Experiment 1) and of prosodic boundaries (Experiments 2 and 3) on the kinematic properties of oral constriction and manual gestures. The hypothesis guiding this work is that prosodic structure will be similarly expressed in both modalities. To test this, we have developed a novel method of data collection that simultaneously records speech audio, vocal tract gestures (using electromagnetic articulometry), and manual gestures (using motion capture). This method allows us, for the first time, to investigate the kinematic properties of body movement and vocal tract gestures simultaneously, and thus to examine the relationship between speech and body gestures with great precision. A second goal of the paper is therefore to establish the validity of this method. Results from two speakers show that manual and oral gestures lengthen under prominence and at prosodic boundaries, indicating that the effects of prosodic structure extend beyond the vocal tract to include body movement.
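The lengthening reported here is a durational measure taken from the recorded trajectories. As a rough illustration only, the sketch below shows one way such a duration could be computed from a sampled EMA or motion-capture trajectory; the velocity-threshold criterion, function name, and sampling rate are our own assumptions, not the authors' analysis procedure.

```python
import numpy as np

def gesture_duration(displacement, fs, vel_threshold=0.1):
    """Estimate gesture duration (s) from a 1-D displacement trajectory.

    displacement : sensor/marker positions (e.g., lip aperture or hand height)
                   for a single gesture token
    fs           : sampling rate in Hz
    vel_threshold: fraction of peak speed used to mark gesture onset/offset
                   (an illustrative criterion, not the authors' exact one)
    """
    velocity = np.gradient(displacement) * fs      # instantaneous velocity
    speed = np.abs(velocity)
    thresh = vel_threshold * speed.max()           # e.g., 10% of peak speed
    above = np.where(speed >= thresh)[0]           # samples inside the gesture
    onset, offset = above[0], above[-1]
    return (offset - onset) / fs

# Example: lengthening for hypothetical prominent vs. non-prominent tokens
# (trajectories would come from the EMA or motion-capture recordings):
# lengthening = gesture_duration(prom_token, fs=200) - gesture_duration(nonprom_token, fs=200)
```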
This study examines sign lowering as a form of phonetic reduction in American Sign Language (ASL). Phonetic reduction occurs in the course of normal language production when, instead of producing a carefully articulated form of a word, the language user produces a less clearly articulated form. When signs are produced in context by native signers, they often differ from the citation forms of those signs. In some cases, phonetic reduction is manifested as a sign being produced at a lower location than in the citation form. Sign lowering has been documented previously, but this is the first study to examine it in phonetic detail. The data presented here are tokens of the sign WONDER, produced by six native signers in two phonetic contexts and at three signing rates, captured by optoelectronic motion capture. The results indicate that sign lowering occurred for all signers, according to the factors we manipulated. Sign production was affected by several phonetic factors that also influence speech production, namely production rate, phonetic context, and position within an utterance. In addition, we have found variation in sign production that could underlie distinctions in signing style, analogous to accent or voice quality in speech.

Keywords: American Sign Language; lowering; phonetic reduction; motion capture; sign production

Introduction

Studies of phonetics and phonology in signed languages can illustrate commonalities and differences between sign and speech. Signed languages are natural languages used by Deaf 1 communities around the world. This study focuses on American Sign Language, which is used by Deaf people in the United States and Canada. Research has shown that ASL and other signed languages are organized similarly to spoken languages, i.e., they have semantic, syntactic, morphological, and phonological systems (Klima & Bellugi, 1979; Sandler & Lillo-Martin, 2006). The terms phonology and phonetics are used in sign language research to describe the sign modality's analogs of the phonological and phonetic aspects of spoken languages. More specifically, sign language phonetics is the study of the physical transmission of ideas through the manual-visual channel by the movement of the arms, hands, and fingers. The basic phonological parameters of signs are movement, handshape, and location (Stokoe, 1960). Minimal pairs in a signed...

1 In this paper, we will use the term Deaf to describe members of the community of sign language users, as distinct from deaf, which refers to clinical hearing loss.
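Since lowering is quantified from the vertical position of the hand in the motion-capture record, a minimal sketch of one plausible measure is given below; the marker choice, units, and function names are illustrative assumptions rather than the study's actual analysis.

```python
import numpy as np

def sign_height(z_trajectory):
    """Peak vertical position (mm) of a hand/wrist marker during one token of the sign."""
    return float(np.max(z_trajectory))

def lowering(reference_z, context_z):
    """Lowering (mm): how much lower the contextual token peaks than a reference token."""
    return sign_height(reference_z) - sign_height(context_z)

# Example with hypothetical tokens of WONDER at slow vs. fast signing rates
# (z-coordinates taken from the optoelectronic motion-capture data):
# print(lowering(slow_token_z, fast_token_z))   # positive value -> fast token is lower
```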
This study explores the coordination between manual pointing gestures and gestures of the vocal tract. Using a novel methodology that allows for concurrent collection of audio, kinematic body, and speech articulator trajectories, we ask (1) which particular speech gesture (vowel gesture, consonant gesture, or tone gesture) the pointing gesture is coordinated with, and (2) at which landmarks the two gestures are coordinated (for example, whether the pointing gesture is coordinated with the speech gesture at its onset or at its maximum displacement). Preliminary results indicate that the pointing gesture is coordinated with the intonation (tone) gesture.
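One way to operationalize which landmark the gestures are coordinated at is to extract candidate landmarks from each trajectory and compare their relative timing. The sketch below is a minimal illustration under that assumption; the onset criterion, sampling rates, and the use of an f0 contour as a stand-in for the tone gesture are hypothetical, not the study's method.

```python
import numpy as np

def landmark_times(trajectory, fs):
    """Candidate landmarks (s) for one gesture: onset, peak velocity, max displacement.

    trajectory : 1-D displacement signal (pointing gesture or speech articulator)
    fs         : sampling rate in Hz
    """
    speed = np.abs(np.gradient(trajectory) * fs)
    onset = np.argmax(speed >= 0.1 * speed.max())   # illustrative 10%-of-peak-speed onset
    return {
        "onset": onset / fs,
        "peak_velocity": int(np.argmax(speed)) / fs,
        "max_displacement": int(np.argmax(trajectory)) / fs,
    }

# Relative timing (lag) between hypothetical pointing and tone-gesture trajectories:
# point_lm = landmark_times(point_traj, fs=120)   # motion-capture wrist trajectory
# tone_lm  = landmark_times(f0_traj, fs=100)      # f0 contour as a proxy for the tone gesture
# lag = point_lm["max_displacement"] - tone_lm["max_displacement"]
```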