Research on signed languages offers the opportunity to address many important questions about language that cannot be addressed through studies of spoken languages alone. Many such studies, however, are inherently limited, because hardly any norms exist for the lexical variables that appear to play important roles in spoken language processing. Here, we present a set of norms for age of acquisition, familiarity, and iconicity for 300 British Sign Language (BSL) signs, as rated by deaf signers, in the hope that they will prove useful to other researchers studying BSL and other signed languages. These norms may be downloaded from www.psychonomic.org/archive.
Studies have suggested that language and executive function (EF) are strongly associated. Indeed, the two are difficult to separate, and it is particularly difficult to determine whether one skill is more dependent on the other. Deafness provides a unique opportunity to disentangle these skills because, in this case, language difficulties have a sensory rather than a cognitive basis. In this study, deaf (n = 108) and hearing (n = 125) children (aged 8 years) were assessed on language and a wide range of nonverbal EF tasks. Deaf children performed significantly less well on EF tasks, even after controlling for nonverbal intelligence and speed of processing. Language mediated EF skill, but the reverse pattern was not evident. The findings suggest that language is key to EF performance, rather than vice versa.
Several recent studies have suggested that deaf children perform more poorly on working memory tasks than hearing children, but these studies have not been able to determine whether this poorer performance arises directly from deafness itself or from deaf children's reduced language exposure. The issue remains unresolved because findings come mostly from (1) verbal rather than non-verbal tasks, and (2) deaf children who use spoken communication and therefore may have experienced impoverished input and delayed language acquisition. This is in contrast to deaf children who have been exposed to a sign language since birth by Deaf parents (and who therefore have native language-learning opportunities within a normal developmental timeframe for language acquisition). A more direct, and therefore stronger, test of the hypothesis that the type and quality of language exposure affect working memory is to use measures of non-verbal working memory (NVWM) and to compare hearing children with two groups of deaf signing children: those who have had native exposure to a sign language, and those who have experienced delayed acquisition and reduced quality of language input compared to their native-signing peers. In this study we investigated the relationship between NVWM and language in three groups aged 6–11 years: hearing children (n = 28), deaf children who were native users of British Sign Language (BSL; n = 8), and deaf children who used BSL but who were not native signers (n = 19). We administered a battery of non-verbal reasoning, NVWM, and language tasks. We examined whether the groups differed on NVWM scores, and whether scores on language tasks predicted scores on NVWM tasks. On the two executive-loaded NVWM tasks included in our battery, the non-native signers performed less accurately than the native signer and hearing groups (who did not differ from one another).
Multiple regression analysis revealed that scores on the vocabulary measure predicted scores on those two executive-loaded NVWM tasks (with age and non-verbal reasoning partialled out). Our results suggest that whatever the language modality—spoken or signed—rich language experience from birth, and the good language skills that result from this early age of acquisition, play a critical role in the development of NVWM and in performance on NVWM tasks.
Linguists have suggested that non-manual and manual markers are used in sign languages to indicate prosodic and syntactic boundaries. However, little is known about how native signers interpret non-manual and manual cues with respect to sentence boundaries. Six native signers of British Sign Language (BSL) were asked to mark sentence boundaries in two narratives: one presented in BSL and one in Swedish Sign Language (SSL). For comparative analysis, non-signers undertook the same tasks. Results indicated that both native signers and non-signers were able to use visual cues effectively in segmentation and that their decisions did not depend on knowledge of the signed language. Signed narratives thus contain visible cues to their prosodic structure that are available to signers and non-signers alike.