2017
DOI: 10.48550/arXiv.1705.05940
Preprint

Subregular Complexity and Deep Learning

Enes Avcu, Chihiro Shibata, Jeffrey Heinz

Abstract: This paper argues that the judicious use of formal language theory and grammatical inference are invaluable tools in understanding how deep neural networks can and cannot represent and learn long-term dependencies in temporal sequences. Learning experiments were conducted with two types of Recurrent Neural Networks (RNNs) on six formal languages drawn from the Strictly Local (SL) and Strictly Piecewise (SP) classes. The networks were Simple RNNs (s-RNNs) and Long Short-Term Memory RNNs (LSTMs) of varying sizes. T…
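
The SL/SP distinction at the heart of these experiments is easy to make concrete. The sketch below is illustrative only: it assumes grammars given as sets of forbidden substrings (SL) or forbidden subsequences (SP), and the function names and forbidden sets are hypothetical, not taken from the paper. It shows why SP languages encode the long-distance dependencies that SL languages cannot.

```python
# Minimal illustrative sketch (not the authors' code): membership checkers
# for Strictly Local (SL) and Strictly Piecewise (SP) languages. An SL
# grammar forbids a set of contiguous k-factors (substrings); an SP grammar
# forbids a set of possibly discontiguous subsequences. The forbidden sets
# below are hypothetical examples, not the paper's actual grammars.

def sl_member(string: str, forbidden_factors: set[str]) -> bool:
    """True iff no forbidden substring (k-factor) occurs in the string."""
    return not any(factor in string for factor in forbidden_factors)

def sp_member(string: str, forbidden_subseqs: set[str]) -> bool:
    """True iff no forbidden subsequence occurs in the string."""
    def is_subsequence(sub: str, s: str) -> bool:
        it = iter(s)
        # 'ch in it' consumes the iterator, so the characters of `sub`
        # must occur in `s` in order, though not necessarily adjacently.
        return all(ch in it for ch in sub)
    return not any(is_subsequence(f, string) for f in forbidden_subseqs)

# SL only bans 'a' immediately followed by 'b'; SP bans 'a' followed by 'b'
# at any distance, which is the long-term dependency an RNN must track.
print(sl_member("acccb", {"ab"}))  # True: 'a' and 'b' are never adjacent
print(sp_member("acccb", {"ab"}))  # False: 'a' precedes 'b' at a distance
```

In both cases a string belongs to the language exactly when it contains none of the forbidden items; the only difference between the two checks is whether "contains" means contiguous or not, which is what separates local from piecewise dependencies.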



Cited by 1 publication (1 citation statement): 0 supporting, 1 mentioning, 0 contrasting.
References: 23 publications.
“…The ability of RNNs to learn classes of formal languages has also been investigated, see e.g. [Avcu et al., 2017] and references therein. It is well known that predictive state representations (PSR) [Littman and Sutton, 2002] are strongly related with WFA [Thon and Jaeger, 2015].…”
Section: Introduction
confidence: 99%