Language and Recursion 2013
DOI: 10.1007/978-1-4614-9414-0_6

Implicit Learning and Recursion

Cited by 17 publications (15 citation statements). References 67 publications.
“…We demonstrate in this study that recurrent neural network models are closer to humans than feedforward ones, irrespective of the grammars' level in the Chomsky hierarchy. This result endorses recurrent models as the more plausible cognitive architecture underlying language, pointing to an essential role of recursion in grammar learning (Rohrmeier, Dienes, Guo, & Fu, 2014). However, it could be far-fetched to generalize this conclusion to natural language acquisition (Corballis, 2014; Jackendoff, 2011; Fitch & Friederici, 2012; Vergauwen, 2014): the Chomsky hierarchy, despite being an excellent reference, does not embrace all natural grammars (Jäger & Rogers, 2012), and does not fully reflect human cognitive complexity (Öttl et al., 2015).…”
Section: Results
confidence: 96%
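The recurrent-versus-feedforward contrast in this statement can be made concrete. Below is a minimal sketch in Python, not the study's actual network models: a stateful ("recurrent-like") recognizer handles the supra-regular language a^n b^n with a single running count, while any fixed-window ("feedforward-like") view sees identical evidence for a grammatical and an ungrammatical string once n exceeds the window size.

```python
# Minimal sketch, NOT the study's models: stateful vs. fixed-window
# processing of the supra-regular (context-free) language a^n b^n.

def stateful_accepts(s: str) -> bool:
    """Recurrent-style processing: one running count of state suffices."""
    count, seen_b = 0, False
    for ch in s:
        if ch == 'a':
            if seen_b:               # an 'a' after a 'b' breaks the shape
                return False
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:            # more b's than a's so far
                return False
        else:
            return False
    return seen_b and count == 0

def kgrams(s: str, k: int) -> set:
    """Everything a fixed window of width k can ever see in s."""
    return {s[i:i + k] for i in range(len(s) - k + 1)}

k = 3
grammatical   = 'a' * (k + 2) + 'b' * (k + 2)   # in a^n b^n
ungrammatical = 'a' * (k + 3) + 'b' * (k + 2)   # one 'a' too many

# Identical width-k windows, so a window-bound decision must treat the two
# strings alike; the stateful pass separates them.
print(kgrams(grammatical, k) == kgrams(ungrammatical, k))              # True
print(stateful_accepts(grammatical), stateful_accepts(ungrammatical))  # True False
```

Unbounded counting is trivial for a device with persistent state and impossible for one limited to a fixed input window, which is the intuition behind the recurrent models' advantage on the higher levels of the hierarchy.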
“…The questions are of course related in that certain structures require learning devices of a certain level of complexity; and a given learning device defines a class of learnable structures. One prominent way of classifying structure complexity that may be relevant to implicit learning is whether the structure could be processed by a device of finite-state or rather more-than-finite-state complexity (e.g., Poletiek, 2011; Rohrmeier et al., 2014).…”
Section: Introduction
confidence: 99%
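What "a device of finite-state complexity" amounts to can be shown in a few lines. The automaton below is a hypothetical illustration, not an example from the chapter: a three-state DFA accepts the strictly local pattern (ab)+, whereas no finite transition table accepts a^n b^n for unbounded n, which is exactly the dividing line the quoted passage draws.

```python
# Hypothetical illustration (not from the chapter): a deterministic
# finite-state automaton accepting the strictly local pattern (ab)+.

TRANSITIONS = {
    ('q0', 'a'): 'q1',   # read the opening 'a'
    ('q1', 'b'): 'q2',   # close the pair
    ('q2', 'a'): 'q1',   # start another pair
}
ACCEPTING = {'q2'}

def dfa_accepts(s: str) -> bool:
    state = 'q0'
    for ch in s:
        state = TRANSITIONS.get((state, ch))
        if state is None:        # undefined transition: reject
            return False
    return state in ACCEPTING

assert dfa_accepts('ab') and dfa_accepts('abab')
assert not dfa_accepts('aabb') and not dfa_accepts('ba')
```

Recognizing a^n b^n instead would need a fresh state for every possible count of unmatched a's, i.e., infinitely many table rows; that is what places such structures beyond finite-state complexity.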
“…E, that need to be encoded). Therefore, if there are multiple (finite), potentially nested non-local dependencies, the number of required states grows exponentially (see also the comparable argument regarding the implicit acquisition of such structures in references [99,114]).…”
Section: Moving Towards Different Types Of Models
confidence: 99%
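The exponential-growth argument can be fleshed out with a back-of-the-envelope enumeration; the alphabet and depth below are illustrative choices, not the paper's. With k nested dependencies, each opened by one of d symbols that must be matched later in reverse order, every opening sequence demands a different continuation, so a finite-state device must keep d^k prefixes apart.

```python
from itertools import product

# Back-of-the-envelope illustration of the quoted argument (my numbers,
# not the paper's): d opener types, nesting depth k, closers required in
# reverse (nested) order.

CLOSER = {'a': 'A', 'b': 'B'}            # d = 2 opener types
k = 3                                    # nesting depth

def required_continuation(opening) -> str:
    """The only grammatical way to close a nested opening sequence."""
    return ''.join(CLOSER[c] for c in reversed(opening))

continuations = {
    ''.join(op): required_continuation(op)
    for op in product(CLOSER, repeat=k)
}
for opening, closing in continuations.items():
    print(opening, '->', closing)        # e.g. aba -> ABA, abb -> BBA

# All 2**k openings demand pairwise different continuations, so no two of
# them may share a state: the state count grows as d**k with depth.
print(len(set(continuations.values())))  # 8 == 2**3
```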
“…One aspect stems from the fact that the CH is by its definition fundamentally tied to rewrite rules and the structures that different types of rewrite rules, constrained by different restrictions, may express. One well-known issue, and an aspect that the notion of mild context-sensitivity addresses, concerns the fact that repetition, repetition under a modification (such as musical transposition), and cross-serial dependencies constitute types of structures that require quite complex rewrite rules (see also the example of context-sensitive rewrite rules expressing cross-serial dependencies in reference [114]). In contrast, such phenomena are frequent forms of form-building in music [25] and animal song.…”
Section: Limitations Of the Chomsky Hierarchy
confidence: 99%
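The three structure types named here, repetition, repetition under a modification, and cross-serial dependencies, can each be written down in a few lines. The encodings below are toy sketches, not the rewrite rules of reference [114]:

```python
# Toy encodings (not reference [114]'s rewrite rules) of the three
# structure types the quoted passage names.

def repetition(motif):
    """Exact repetition: the copy language w w."""
    return motif + motif

def transposition(motif, shift=2):
    """Repetition under modification: the copy shifted by a constant,
    as in a musical transposition of pitch numbers."""
    return motif + [p + shift for p in motif]

def cross_serial(openers):
    """Cross-serial dependencies: closers appear in the SAME order as
    their openers (a1 a2 b1 b2), unlike nested a1 a2 b2 b1."""
    return openers + [o.upper() for o in openers]

print(repetition(['C', 'E', 'G']))    # ['C', 'E', 'G', 'C', 'E', 'G']
print(transposition([60, 64, 67]))    # [60, 64, 67, 62, 66, 69]
print(cross_serial(['a', 'b', 'a']))  # ['a', 'b', 'a', 'A', 'B', 'A']
```

All three are copy-like patterns, and the copy language {ww} is not context-free; that is why such structures call for rewrite machinery above the context-free level and why mildly context-sensitive formalisms are invoked for them.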