Parallelism increases iterative learning power (2009)
DOI: 10.1016/j.tcs.2009.01.015
Cited by 6 publications (5 citation statements)
References 15 publications
“…The results of this query reveal 1. the importance of order in LDP and the relative unimportance of distance, 2. that a debate about the nature of locality in phonology is also a debate about computational complexity in phonology, and 3. that long-distance dissimilation fundamentally differs from long-distance assimilation with respect to the kinds of phonotactic patterns they derive. 1 This resembles what formal learning theorists call parallel learning (Case and Moelius 2007). 2 Within OT, there are particular learning proposals that fail to learn the factorial typology and in this way constrain the predicted possible languages (Tesar and Smolensky 2000, Boersma 2003).…”
Section: Outline
confidence: 88%
“…rRInfEx = rInfEx and rRTxtEx = rTxtEx. On the other hand [CM09] showed rRItTxtEx ⊊ rItTxtEx.…”
Section: Iterative Learning From Informant
confidence: 94%
“…For understanding iterative learners, we analyze what normal forms can be assumed about such learners in Section 2.2. First, we show that, analogously to the case of learning from text (as analyzed in [CM09]), we cannot assume learners to be total (i.e., always giving an output).…”
Section: Introduction
confidence: 98%
“…An iterative learner computes its guesses about the target language based on its own most recent conjecture and on the current data item only. Iterative Learning is a well‐studied model (Becerra‐Bonache, Case, Jain, & Stephan, ; Case & Moelius, ; Case, Jain, Lange, & Zeugmann, ; Jain & Kinber, ; Jain, Lange, & Zilles, ; Lange & Grieser, ). Most interestingly, in the perspective of the present study, Iterative Learning is the base case of a hierarchy of stronger and stronger memory‐limited models.…”
Section: U‐shaped Learning With Memory Limitations
confidence: 99%
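The iterative-learner model described above — each new conjecture is a function of the previous conjecture and the current datum only — can be sketched in a few lines. This is a hypothetical toy example, not a construction from the cited papers: the target class is assumed to be the initial segments L_n = {0, 1, …, n}, where the needed memory (the largest element seen) fits inside the conjecture itself.

```python
# Toy iterative learner (illustrative assumption, not from Case & Moelius):
# the learner keeps no record of past data beyond its own last conjecture.
# Target class: L_n = {0, 1, ..., n}; a conjecture is the integer n naming L_n.

def iterative_update(previous_conjecture: int, datum: int) -> int:
    """One learning step: new guess from the old guess and current datum only."""
    return max(previous_conjecture, datum)

def run_learner(text):
    """Feed a presentation (any enumeration of the target language) to the learner."""
    conjecture = 0  # initial guess: L_0 = {0}
    for datum in text:
        conjecture = iterative_update(conjecture, datum)
    return conjecture

# On any text for L_5 the learner converges to the conjecture 5.
print(run_learner([3, 0, 5, 2, 5, 1, 4]))
```

The point of the sketch is the restricted signature of `iterative_update`: a full (non-memory-limited) learner would instead receive the entire sequence of data seen so far, which is exactly the memory restriction the hierarchy mentioned above relaxes step by step.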