The present study tracked differences in activation patterns in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses and again after their first and second semesters of instruction, for a total of 10 months of instruction. The study aimed to characterize the shift from modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps, right-hemispheric recruitment was observed alongside increasing left-lateralization, a pattern similar to that of native signers and L2 learners of spoken languages; however, specialization for sign language processing, with activation in the inferior parietal lobule (i.e., angular gyrus), was also observed even in these late learners. As such, the present study is the first to track L2 learners of sign language over time in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing.
This study investigated the structure of the bimodal bilingual lexicon. In the cross-modal priming task, nonnative sign language learners heard an English word (e.g., keys) and responded to the lexicality of a signed target: an underlying rhyme (e.g., cheese) or a sign neighbor of that word (e.g., paper). The results indicated that rhyme words were retrieved more quickly and that L2 neighbors were retrieved faster by beginner learners. An item analysis further indicated that semantics did not facilitate neighbor retrieval and that high-frequency signs were retrieved more quickly. The AX discrimination task showed that learners focus on the handshape and movement parameters and discriminate them equally well. Interlanguage dynamics play an important role in determining which phonological parameters are used and how activation spreads over time. A nonselective, integrated model of the bimodal bilingual lexicon is proposed in which lateral connections weaken over time and the handshape parameter feeds most of the activation to neighboring signs as a function of system dynamics.
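The proposed architecture can be read as a spreading-activation network in which sign neighbors share activation in proportion to their overlapping phonological parameters, with handshape weighted most heavily and lateral connections decaying as proficiency grows. The sketch below is a minimal, hypothetical illustration of that kind of model; the lexical items, parameter weights, and decay function are invented for demonstration and are not the authors' implementation.

```python
# Illustrative sketch only: a toy spreading-activation model of the kind of
# nonselective, integrated bimodal lexicon described above. All lexical items,
# weights, and the decay rule are hypothetical.

from dataclasses import dataclass

# Hypothetical weights: the handshape parameter carries most lateral activation.
PARAMETER_WEIGHTS = {"handshape": 0.6, "movement": 0.25, "location": 0.15}

@dataclass
class Sign:
    gloss: str
    handshape: str
    movement: str
    location: str
    activation: float = 0.0

def lateral_weight(a: Sign, b: Sign, proficiency: float) -> float:
    """Shared-parameter overlap, scaled down as proficiency grows
    (lateral connections are assumed to weaken over time)."""
    overlap = sum(
        w for p, w in PARAMETER_WEIGHTS.items()
        if getattr(a, p) == getattr(b, p)
    )
    decay = max(0.0, 1.0 - proficiency)   # proficiency in [0, 1]
    return overlap * decay

def spread(source: Sign, lexicon: list[Sign], proficiency: float) -> None:
    """Send a fraction of the source's activation to its sign neighbors."""
    for target in lexicon:
        if target is not source:
            target.activation += source.activation * lateral_weight(source, target, proficiency)

# Toy lexicon (parameter values are invented for illustration).
cheese = Sign("CHEESE", handshape="flat-B", movement="twist", location="palm", activation=1.0)
paper  = Sign("PAPER",  handshape="flat-B", movement="brush", location="palm")
school = Sign("SCHOOL", handshape="flat-B", movement="clap",  location="palm")

for level in (0.1, 0.8):                  # beginner vs. more proficient learner
    for s in (paper, school):
        s.activation = 0.0
    spread(cheese, [cheese, paper, school], proficiency=level)
    print(level, round(paper.activation, 3), round(school.activation, 3))
```

Under these toy assumptions, neighbors sharing the handshape parameter receive most of the spread activation, and the same prime activates its neighbors less as simulated proficiency increases, mirroring the weakening lateral connections proposed above.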
The present study investigates whether the inferior frontal gyrus (IFG) is activated for phonetic segmentation of both speech and sign. Adult second language learners of Spanish and American Sign Language at the very beginning of instruction were tested on their ability to classify lexical items in each language based on their phonetic categories (i.e., initial segments or the location parameter, respectively). Conjunction analyses indicated that the left-lateralized IFG, superior parietal lobule (SPL), and precuneus were activated for both languages. Common activation in the left IFG suggests a modality-independent mechanism for phonetic segmentation. Additionally, common activation in parietal regions suggests spatial preprocessing of audiovisual and manuovisual information for subsequent frontal recoding and mapping. Taken together, we propose that this frontoparietal network supports domain-general segmentation of either acoustic or visual signals and is important for novel phonetic segmentation.
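For readers unfamiliar with conjunction analyses, the general logic is to identify voxels that show suprathreshold activation in both task contrasts, often via a minimum-statistic criterion. The sketch below illustrates only that logic on randomly generated arrays; the maps, threshold, and dimensions are hypothetical and do not come from the study.

```python
# Illustrative sketch only: a minimum-statistic conjunction over two
# statistical maps, i.e., voxels active for BOTH languages.
# The maps below are random arrays, not the study's data.

import numpy as np

rng = np.random.default_rng(1)
t_spanish = rng.normal(size=(4, 4, 4))   # hypothetical t-map, spoken-language task
t_asl     = rng.normal(size=(4, 4, 4))   # hypothetical t-map, sign-language task

t_threshold = 1.65                        # illustrative one-sided threshold

# A voxel counts only if it exceeds the threshold in both maps,
# which is equivalent to thresholding the voxelwise minimum.
conjunction = np.minimum(t_spanish, t_asl) > t_threshold
print("voxels active for both tasks:", int(conjunction.sum()))
```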
In the present study we aimed to investigate phonological substitution errors made by hearing second-modality, second-language (M2L2) learners of American Sign Language (ASL) during a sentence translation task. Learners saw sentences in ASL that were signed by either a native signer or an M2L2 learner and simply translated each sentence from ASL to English. Learners' responses were analyzed for lexical translation errors caused by phonological parameter substitutions. Unlike previous related studies, tracking phonological substitution errors during sentence translation allows for the characterization of uncontrolled, naturalistic perception errors. Results indicated that learners made mostly movement errors, followed by handshape and location errors. Learners made more movement errors for sentences signed by the M2L2 learner than for those signed by the native signer. Additionally, high-proficiency learners made more handshape errors than low-proficiency learners. Taken together, this pattern of results suggests that late M2L2 learners are poor at perceiving the movement parameter and that variability in M2L2 production of the movement parameter further hinders perception.
Understanding how language modality (i.e., signed vs. spoken) affects second language outcomes in hearing adults is important both theoretically and pedagogically, as it can determine the specificity of second language (L2) theory and inform how best to teach a language that uses a new modality. The present study investigated which cognitive-linguistic skills predict successful L2 sign language acquisition. A group (n = 25) of hearing adult L2 learners of American Sign Language underwent a cognitive-linguistic test battery before and after one semester of sign language instruction. Measures of verbal memory, phonetic categorization skill, and vocabulary knowledge were entered into a multiple linear regression analysis to determine whether they predicted proficiency. Results indicated that English vocabulary knowledge and phonetic categorization skill predicted both vocabulary growth and self-rated proficiency at the end of one semester of instruction. Memory skills did not significantly predict either proficiency measure. These results highlight how linguistic skills in the first language (L1) directly predict L2 learning outcomes regardless of differences in L1 and L2 language modalities.
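As a rough illustration of the kind of analysis described (a multiple linear regression predicting an end-of-semester outcome from pre-instruction measures), the sketch below uses hypothetical predictor names (vocab_knowledge, phonetic_categorization, verbal_memory), a hypothetical outcome (vocab_growth), and simulated data; it is not the study's actual model, variables, or results.

```python
# Illustrative sketch only: a multiple linear regression of the general form
# used in the study. All data and effect sizes here are simulated.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 25  # matches the reported sample size, but the values are simulated

df = pd.DataFrame({
    "vocab_knowledge":         rng.normal(size=n),  # hypothetical L1 vocabulary score
    "phonetic_categorization": rng.normal(size=n),  # hypothetical categorization accuracy
    "verbal_memory":           rng.normal(size=n),  # hypothetical memory span score
})
# Simulated outcome: driven by vocabulary and phonetic skill, not memory.
df["vocab_growth"] = (0.6 * df["vocab_knowledge"]
                      + 0.4 * df["phonetic_categorization"]
                      + rng.normal(scale=0.5, size=n))

X = sm.add_constant(df[["vocab_knowledge", "phonetic_categorization", "verbal_memory"]])
model = sm.OLS(df["vocab_growth"], X).fit()
print(model.summary())  # coefficients and p-values for each predictor
```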