2017
DOI: 10.1111/cogs.12519

Input and Age‐Dependent Variation in Second Language Learning: A Connectionist Account

Abstract: Language learning requires linguistic input, but several studies have found that knowledge of second language (L2) rules does not seem to improve with more language exposure (e.g., Johnson & Newport, 1989). One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we reanalyzed grammaticality judgment scores in Flege, Yeni‐Komshian, and Liu's (1999) study of L2 learners using rule‐related predictors and found that, in addition to the ove…

Cited by 24 publications (32 citation statements)
References: 124 publications
“…In most of these models, turning off the computation of summary signals of mean activation or activation update disrupts the model's ability to explain ERPs, but no other functions are affected (e.g., comprehension of meaning can still take place). In the Error Propagation account, turning off the error computation that supports ERPs also prevents the model from learning, which means that it can no longer explain adaptation effects like structural priming (Chang et al., 2006) or second language acquisition in adults (Janciauskas & Chang, 2018). In contrast, deactivating the computation of the Semantic Update in the model of Rabovsky et al. (2018) would not affect any of the comprehension results obtained with the Sentence Gestalt model in St.…”
Section: Alternative Accounts of ERPs
confidence: 99%
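
The contrast drawn in the statement above can be made concrete with a toy word-prediction network. The sketch below is only an illustration under assumed sizes, vocabulary, and a delta-rule update, not the Dual-path model or any published architecture; its sole point is that the quantity summed as an ERP-like index is the same error that adjusts the weights, so removing the error computation would remove both the ERP account and learning-driven adaptation.

```python
# Toy illustration (not the published Dual-path model): in an error-driven
# recurrent word-prediction network, the prediction error that is read out
# as an N400-like index is the same quantity that drives the weight update.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "dog", "cat", "chases", "sleeps", "."]
V = len(VOCAB)
H = 16                                   # hidden units (arbitrary sketch size)
IDX = {w: i for i, w in enumerate(VOCAB)}

# Simple recurrent network weights (illustrative, randomly initialised).
Wxh = rng.normal(0, 0.1, (H, V))
Whh = rng.normal(0, 0.1, (H, H))
Why = rng.normal(0, 0.1, (V, H))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def process(sentence, learn=True, lr=0.1):
    """Predict each next word; return the summed prediction error.

    The error is used twice: summed as an ERP-like amplitude and, when
    `learn` is True, used to update the output weights. Both uses read the
    same `err`, so removing the error computation removes both.
    """
    global Why
    h = np.zeros(H)
    total_error = 0.0
    for t in range(len(sentence) - 1):
        x = np.zeros(V); x[IDX[sentence[t]]] = 1.0
        h = np.tanh(Wxh @ x + Whh @ h)
        p = softmax(Why @ h)                 # predicted next-word distribution
        target = np.zeros(V); target[IDX[sentence[t + 1]]] = 1.0
        err = target - p                     # word prediction error
        total_error += np.abs(err).sum()     # ERP-like summary of the error
        if learn:
            Why += lr * np.outer(err, h)     # the same error adjusts the weights
    return total_error

sent = ["the", "dog", "chases", "the", "cat", "."]
print("error on exposure 1:", round(process(sent), 3))
print("error on exposure 2:", round(process(sent), 3))  # typically smaller: adaptation
```
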
“…This complex architecture is motivated by the need to explain a range of different behaviors from production (structural priming, Chang et al., 2006; heavy NP shift, word order effects, Chang, 2009; aphasia, Chang, 2002) and acquisition studies (auxiliary inversion, Fitz & Chang, 2017; acquisition of verb classes, Twomey et al., 2014; accessibility hierarchy, Fitz et al., 2011). It has also been applied to learn syntactic constraints from different languages (German, Chang, Baumann, Pappert, & Fitz, 2015; Japanese, Chang, 2009; English-learning Korean speakers, Janciauskas & Chang, 2018). In this work, we are testing whether ERP effects in comprehension can arise out of an architecture that was designed originally for production and acquisition.…”
Section: A Connectionist Model of Event-Related Potentials
confidence: 99%
“…It requires no architectural adaptations, apart from the minimal addition of language-control units that steer the network towards producing a sentence in one of the two languages. Janciauskas and Chang (2018) used the bilingual Dual-path model to study the effects of L2 age of acquisition (AoA) and length of L2 exposure. They first trained the model on Korean-like sentences.…”
Section: Bilingual RNN Sentence Comprehension
confidence: 99%
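
The "minimal addition of language-control units" mentioned in that statement can be sketched with an assumption-laden toy: a one-layer production network whose input includes a two-unit language flag next to the message, so the same weights are steered toward one language's word forms or the other's. The vocabulary, layer sizes, and training schedule below are hypothetical; the published bilingual Dual-path model embeds this cue in a full sentence-production architecture.

```python
# Hedged sketch of the "language-control units" idea: a prediction network
# whose input carries a two-unit language flag alongside the message.
# Vocabulary, sizes, and the single-layer delta rule are illustrative
# assumptions, not the bilingual Dual-path model itself.
import numpy as np

rng = np.random.default_rng(1)

# Toy bilingual lexicon: two meanings, each with an English-like and a
# Korean-like form.
WORDS = ["dog", "cat", "gae", "goyangi"]
V = len(WORDS)
MEANINGS = {"DOG": 0, "CAT": 1}

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Input = meaning units (2) + language-control units (2: 0 = English, 1 = Korean).
W = rng.normal(0, 0.1, (V, 2 + 2))

def produce(concept, lang):
    x = np.concatenate([one_hot(MEANINGS[concept], 2), one_hot(lang, 2)])
    return softmax(W @ x)

def train(pairs, epochs=200, lr=0.5):
    global W
    for _ in range(epochs):
        for concept, lang, word in pairs:
            x = np.concatenate([one_hot(MEANINGS[concept], 2), one_hot(lang, 2)])
            p = softmax(W @ x)
            W += lr * np.outer(one_hot(WORDS.index(word), V) - p, x)

# Korean-like exposure first, English-like exposure later, loosely mirroring
# the late-L2 training regime described in the quoted statement.
train([("DOG", 1, "gae"), ("CAT", 1, "goyangi")])
train([("DOG", 0, "dog"), ("CAT", 0, "cat")])

p = produce("DOG", 0)
print("P(word | DOG, English flag):",
      {w: round(float(p[i]), 2) for i, w in enumerate(WORDS)})
```

The design point is simply that no separate network per language is needed; the control units select among mappings learned by shared weights.
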
“…Several NLP applications have emerged, e.g., to detect code-switches (Solorio and Liu, 2008; Guzmán et al., 2017), or to automatically recognize code-switched speech (Yılmaz et al., 2016; Gonen and Goldberg, 2018). Moreover, there are a small number of cognitive computational models relevant to code-switching: Filippi et al. (2014) developed a model of code-switched word production and Janciauskas and Chang (2018), while simulating age of acquisition effects on native Korean speakers of English, reported that the models that had been exposed to English later produced code-switches, i.e., occasionally used Korean words in their predominantly English production.…”
Section: Introduction
confidence: 99%