Linguistic relativity theory has received empirical support in domains such as color perception and object categorization. It is unknown, however, whether relations between words that are idiosyncratic to a given language impact non-verbal representations and conceptualizations. For instance, would one consider the concepts of horse and sea as related were it not for the existence of the compound seahorse? Here, we investigated such arbitrary conceptual relationships using a non-linguistic picture relatedness task in participants undergoing event-related brain potential recordings. Picture pairs arbitrarily related because of a compound and presented in the compound order elicited N400 amplitudes similar to those of unrelated pairs. Surprisingly, however, pictures presented in the reverse order (as in the sequence horse–sea) significantly reduced N400 amplitudes, demonstrating the existence of a link in memory between these two otherwise unrelated concepts. These results break new ground in the domain of linguistic relativity by revealing unexpected semantic associations driven by lexical relations intrinsic to language.
Recent psycholinguistic research demonstrates that using a second language has consequences for the first language (e.g. Dussias, 2003; Van Hell & Dijkstra, 2002) and for domain-general cognitive processes (Bialystok, 2005). This work suggests that the language system is permeable, with cross-language exchange at every level of processing (Malt & Sloman, 2003). Critically, even proficient bilinguals appear unable to switch off the language not in use when they hear, read, or speak one language alone (e.g. Dijkstra, 2005; Kroll, Bobb, & Wodniecka, 2006; Marian & Spivey, 2003), creating cross-language competition. In this article, we describe research that considers how cross-language activation is modulated during spoken production and during the earliest stages of second language learning. We hypothesize that the open nature of the bilingual’s language system may create optimal conditions for new language learning and also for enhanced cognitive control that enables effective selection of the language to be spoken.
Interactive models of language production predict that it should be possible to observe long-distance interactions: effects that arise at one level of processing influence multiple subsequent stages of representation and processing. We examine the hypothesis that disruptions arising in non-form-based levels of planning (specifically, lexical selection) should modulate articulatory processing. A novel automatic phonetic analysis method was used to examine productions in a paradigm yielding both general disruptions to formulation processes and, more specifically, overt errors during lexical selection. This analysis method allowed us to examine articulatory disruptions at multiple levels of analysis, from whole words to individual segments. Baseline performance by young adults was contrasted with young speakers' performance under time pressure (which previous work has argued increases interaction between planning and articulation) and performance by older adults (who may have difficulties inhibiting nontarget representations, leading to heightened interactive effects). The results revealed interactive effects, which our new analysis techniques showed were strongest in initial portions of responses, suggesting that speech is initiated as soon as the first segment has been planned. Interactive effects did not increase under time pressure, suggesting that interaction between planning and articulation is relatively fixed. Unexpectedly, lexical selection disruptions appeared to yield some degree of facilitation in articulatory processing (possibly reflecting semantic facilitation of target retrieval), and older adults showed weaker, not stronger, interactive effects (possibly reflecting weakened connections between lexical and form-level representations).
In the last 25 years, cognitive scientists have come to see that using more than one language is a natural circumstance of human experience, not an exceptional condition that produces disordered speaking or thinking (1). The revision of previously held negative views about bilingualism is largely attributable to the finding that although bilinguals appear to activate both languages when reading, listening to speech, and speaking one language alone, they do not suffer notable disruptions (2). Bilingual speakers can choose the language they wish to speak, but at the same time switch back and forth from one language to the other, often in midsentence, with others who are similarly bilingual. The evidence suggests that the consequences of bilingualism are largely positive, with features of bilingual minds and brains reflecting the benefits of a life spent negotiating the presence of two languages and acquiring the skill to select the appropriate language in the intended context (3, 4).

In a fascinating PNAS report, Zhang et al. (5) demonstrate that bilingual speakers are sensitive to cultural cues in the environment that signal the presence of the native language. In contrast to the claim that bilingualism confers benefits on those who speak more than one language, the form of the sensitivity demonstrated in the Zhang et al. research is negative, with slower speech when speaking the second language, L2, in the presence of the first language, L1. Most dramatically, these disruptions to L2 speech occur when bilinguals are speaking to the faces of other bilinguals with the same L1. Zhang et al. (5) show that when Chinese-English bilinguals speak to Chinese faces, they speak English, their L2, more hesitantly than when they speak to Caucasian faces. Critically, the experiments that generated these data were conducted in the United States with bilinguals immersed in American culture and in English as the dominant language of the environment. The fact that their English changes when speaking to a Chinese vs. Caucasian face suggests that native language cues and culture are powerful determinants of language performance.

Why faces? For infants during early stages of language development, the face is the primary source of information about the connection between language and the social environment (6). For infants exposed to two languages from birth, faces may be a special cue to which language is being spoken. A remarkable study by Sebastián-Gallés et al. (7) showed that crib bilinguals can discriminate which of two languages is being spoken on the basis of a silent video of speaking faces, even when the languages spoken are not the languages to which the infants have been exposed. Babies exposed to only one language are not able to make the discrimination, suggesting that, like the results of the Zhang et al. study (5), it is something about bilingualism itself and not simply dependence on a ...