We studied whether and when people detect the beginning of a gesture, in our case a sign in Sign Language of the Netherlands (SLN), by presenting movie fragments consisting of sequences of rest positions, fidgets, and signs to deaf signers, hearing signers, and non-signers. Participants were instructed to respond as soon as they saw that an SLN sign had begun. All participants proved highly capable of responding to sign beginnings. Signs that are two-handed, are performed in signing space, have a highly marked hand shape, and contain path movement were discriminated best. Treating a sign as having a preparation, a stroke, and a recovery, response times clustered strongly around 500 ms after the beginning of sign preparation, or 200 ms after the onset of the stroke. Non-signers needed more time before responding, and deaf signers took more time than hearing signers. Response times were shorter for signs that have a highly marked hand shape, are one-handed, or are preceded by fidgets. The results show that people can discriminate between fidgeting and signing based on appearance alone, even without knowing a sign language. No single feature of the movement appears necessary to detect the beginning of a sign. In most cases the visual information available up to an early stage of the stroke is sufficient, but in some cases the information in the preparation alone is enough.
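Taken together, the two reported response-time anchors imply that, for these signs, the stroke began on average roughly 300 ms after the start of the preparation. As a worked form (the symbol names are ours, derived from the averages above):

\[
t_{\text{stroke onset}} - t_{\text{prep onset}} \approx 500\ \text{ms} - 200\ \text{ms} = 300\ \text{ms}
\]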
The aim of this paper is to examine when signers start to recognize the lexical meaning of a sign. This was studied with movies of 32 monomorphemic signs of Sign Language of the Netherlands (SLN). Signs were presented in isolation or with preceding fidgets (e.g., rubbing one's nose). Signers watched these movies at normal playing speed and had to respond as soon as they recognized a sign, which they were able to do, on average, about 850 ms after the coded beginning of the sign. After subtracting the time participants need to generate a motor response to a visible event (310 ms on average), sign recognition was estimated to occur around 540 ms after the beginning of the sign. The results were further analyzed in relation to the sign's movement phases (preparation, nucleus, and recovery) and for effects of participant characteristics, sign characteristics, and embedding conditions. The current findings are compared with earlier work on the time course of lexical sign recognition, and with findings from an earlier experiment on detecting the beginning of a sign (Arendsen et al., 2007), to study possible interference of lexical recognition with sign detection by signers.
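The 540 ms estimate follows from a simple subtraction of the two reported averages; as a worked form (the symbol names are ours):

\[
t_{\text{recognition}} \approx t_{\text{response}} - t_{\text{motor}} = 850\ \text{ms} - 310\ \text{ms} = 540\ \text{ms}
\]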
In this study we compare human and machine acceptability judgments for extreme variations in sign productions. We gathered acceptability judgments from 26 signers and scores from three different Automatic Gesture Recognition (AGR) algorithms that could potentially be used for automatic acceptability judgments, in which case the correlation between human ratings and AGR scores may serve as an 'acceptability performance' measure. We found high human-human correlations and high AGR-AGR correlations, but low human-AGR correlations. Furthermore, when comparing the acceptability performance and the classification performance of the different AGR methods, classification performance proved an unreliable predictor of acceptability performance.
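To illustrate how such an 'acceptability performance' measure could be computed, here is a minimal Python sketch that correlates human acceptability ratings with the scores of one AGR algorithm. The data values are hypothetical, and the choice of Pearson correlation is an assumption; the abstract does not specify which correlation measure the study used.

```python
# Minimal sketch of an 'acceptability performance' measure:
# correlate human acceptability ratings with AGR scores per sign production.
# Data values are hypothetical; Pearson correlation is an assumed choice.
import numpy as np
from scipy.stats import pearsonr

# Mean human acceptability rating per sign production (hypothetical values).
human_ratings = np.array([4.2, 1.8, 3.5, 2.1, 4.8, 1.2])

# Scores assigned to the same productions by one AGR algorithm (hypothetical).
agr_scores = np.array([0.81, 0.40, 0.66, 0.52, 0.90, 0.35])

# Acceptability performance of this AGR method: correlation with human judgments.
r, p_value = pearsonr(human_ratings, agr_scores)
print(f"acceptability performance r = {r:.2f} (p = {p_value:.3f})")
```

Repeating this per AGR method, and separately computing human-human and AGR-AGR correlations, would reproduce the kind of comparison the study reports.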