1998
DOI: 10.1007/bfb0095051
Computation in recurrent neural networks: From counters to iterated function systems

Cited by 8 publications (8 citation statements, published 1999–2020); references 5 publications.
“…A range of studies has experimentally examined the ability of recurrent networks to model counter languages such as a^n b^n (Kalinke and Lehmann, 1998; Gers and Schmidhuber, 2001; Cartling, 2008; Weiss et al., 2018; Suzgun et al., 2019). Other work has experimentally studied the performance of recurrent architectures on learning to recognize well-bracketed strings, a similar but more challenging problem (Sennhauser and Berwick, 2018; Skachkova et al., 2018; Bernardy, 2018).…”
Section: Investigating the Power of Sequence Modeling
Mentioning confidence: 99%
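
For readers outside this subfield, a^n b^n is the canonical one-counter language these studies probe: it can be recognized by incrementing a counter on each a and decrementing on each b, with no need for a full stack. A minimal non-neural sketch of that counter mechanism, added here as an illustration rather than as code from any of the cited papers:

```python
def is_anbn(s: str) -> bool:
    """Accept exactly the strings a^n b^n (n >= 0) using a single counter."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:            # an 'a' after the first 'b' breaks the a^n b^n shape
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:         # more b's than a's so far
                return False
        else:
            return False          # alphabet is {a, b}
    return count == 0             # every 'a' must be matched by a 'b'

# "aabb" is accepted; "aab" and "abab" are not.
assert is_anbn("aabb") and not is_anbn("aab") and not is_anbn("abab")
```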
“…Hierarchical structure is widely thought to be essential to modeling natural language, in particular its syntax (Everaert et al., 2015). Consequently, many researchers have studied the capability of recurrent neural network models to capture context-free languages (e.g., Kalinke and Lehmann, 1998; Gers and Schmidhuber, 2001; Grüning, 2006; Weiss et al., 2018; Sennhauser and Berwick, 2018; Korsky and Berwick, 2019) and linguistic phenomena involving hierarchical structure (e.g., Linzen et al., 2016; Gulordava et al., 2018). Some experimental evidence suggests that transformers might not be as strong as LSTMs at modeling hierarchical structure (Tran et al., 2018), though analysis studies have shown that transformer-based models encode a good amount of syntactic knowledge (e.g., Clark et al., 2019; Lin et al., 2019; Tenney et al., 2019).…”
Section: Introduction
Mentioning confidence: 99%
“…Linzen et al. (2016) study the application of LSTMs to certain grammatical phenomena. RNNs and their variants have been used for recognizing Dyck words (Kalinke and Lehmann, 1998; Deleu and Dureau, 2016). Li et al. (2017) evaluate their nonlinear weighted finite automata model on a Dyck language.…”
Section: Related Work
Mentioning confidence: 99%
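
As background for the Dyck-word results mentioned above, a Dyck word is a well-bracketed string; recognizing one over several bracket types requires stack-like memory, while a single bracket type again reduces to a counter. A minimal stack-based checker, added purely as an illustration and not drawn from the cited papers:

```python
def is_dyck(s: str) -> bool:
    """Check whether s is a well-bracketed (Dyck) word over (), [], {}."""
    closers = {")": "(", "]": "[", "}": "{"}
    openers = set(closers.values())
    stack = []
    for ch in s:
        if ch in openers:
            stack.append(ch)                          # remember the open bracket
        elif ch in closers:
            if not stack or stack.pop() != closers[ch]:
                return False                          # mismatched or unopened bracket
        else:
            return False                              # only bracket symbols allowed
    return not stack                                  # every bracket must be closed

# "([]){}" is well bracketed; "([)]" is not.
assert is_dyck("([]){}") and not is_dyck("([)]")
```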
“…Previous work showed that LSTM outperforms traditional RNN algorithms on numerous tasks involving real-valued or discrete inputs and targets [5], [7], including tasks that require learning the rules of regular languages (RLs) describable by deterministic finite-state automata (DFAs) [1], [2], [8], [21], [31]. Until now, however, it has remained unclear whether LSTM's superiority carries over to tasks involving context-free languages (CFLs), such as those discussed in the RNN literature [16], [17], [23]–[25], [28].…”
Section: Introduction
Mentioning confidence: 99%
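
To make the contrast with the context-free case concrete, a regular language describable by a deterministic finite-state automaton needs only a fixed, finite set of states, with no counter or stack. A toy DFA sketch for an assumed example language (strings over {a, b} with an even number of a's), added here as an illustration rather than anything taken from the cited work:

```python
def accepts_even_as(s: str) -> bool:
    """DFA accepting strings over {a, b} that contain an even number of a's."""
    transitions = {
        ("even", "a"): "odd",
        ("even", "b"): "even",
        ("odd", "a"): "even",
        ("odd", "b"): "odd",
    }
    state = "even"                        # start state, also the accepting state
    for ch in s:
        if (state, ch) not in transitions:
            return False                  # symbol outside the alphabet {a, b}
        state = transitions[(state, ch)]
    return state == "even"

# "abba" has two a's (accepted); "ab" has one (rejected).
assert accepts_even_as("abba") and not accepts_even_as("ab")
```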