2016
DOI: 10.48550/arxiv.1605.08346
Preprint

Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks

Adam Charles,
Dong Yin,
Christopher Rozell

Abstract: Recurrent neural networks (RNNs) have drawn interest from machine learning researchers because of their effectiveness at preserving past inputs for time-varying data processing tasks. To understand the success and limitations of RNNs, it is critical that we advance our analysis of their fundamental memory properties. We focus on echo state networks (ESNs), which are RNNs with simple memoryless nodes and random connectivity. In most existing analyses, the short-term memory (STM) capacity results conclude that t…
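The setup the abstract describes (memoryless nodes, fixed random connectivity, memory held only in the distributed network state) can be made concrete with a short numerical sketch. The following is a minimal illustration, not the authors' code: it assumes the linear ESN update x[t] = W x[t-1] + z s[t] commonly used in STM analyses, and all names (n_nodes, W, z, the delay k) are chosen here for exposition.

```python
# Minimal linear echo state network (ESN) sketch illustrating the STM setting:
# memoryless nodes, fixed random connectivity, scalar input stream.
# Illustrative only; not code from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_nodes = 100   # network size (number of nodes)
T = 500         # length of the input stream

# Random recurrent connectivity, rescaled to spectral radius < 1
# (the "echo state" condition that makes memory of past inputs fade).
W = rng.standard_normal((n_nodes, n_nodes))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

z = rng.standard_normal(n_nodes)  # random input weights

# Linear state update: x[t] = W x[t-1] + z * s[t]. The nodes themselves
# are memoryless; all memory lives in the evolving network state.
s = rng.standard_normal(T)
x = np.zeros(n_nodes)
states = np.empty((T, n_nodes))
for t in range(T):
    x = W @ x + z * s[t]
    states[t] = x

# STM probe: linearly read out the input from k steps ago. Capacity is,
# roughly, how large k can get before linear recovery breaks down.
k = 10
X, y = states[k:], s[:-k]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
corr = np.corrcoef(X @ w_out, y)[0, 1]
print(f"delay {k}: readout correlation {corr:.3f}")
```

Sweeping k in this sketch traces out the kind of memory curve that STM capacity analyses characterize as a function of network size and input structure.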

Cited by 1 publication (1 citation statement, published 2017)
References 60 publications
“…Nowadays, RNNs are gaining renewed interest in neuroscience due to their biological plausibility [7][8][9][10], and in computer science and engineering for their modeling ability [11,12]. RNNs are capable of generating complex dynamics and performing inference based on current inputs and internal state, the latter maintaining a vanishing memory of past inputs [13,14].…”
Section: Introduction
confidence: 99%
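The "vanishing memory" the citing work refers to can be checked numerically: with spectral radius below one, the imprint of a past input on the network state shrinks as it is propagated forward. A small illustrative sketch, with model and parameters assumed here rather than taken from either paper:

```python
# Numerical illustration of vanishing memory: with spectral radius < 1,
# a single input pulse's imprint on the state decays (roughly) geometrically.
# Illustrative assumptions throughout; not from the cited papers.
import numpy as np

rng = np.random.default_rng(1)
n = 50
W = rng.standard_normal((n, n))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.8

x = rng.standard_normal(n)  # state right after an input pulse
for t in [1, 5, 10, 20]:
    # After t further steps, the pulse's contribution to the state is W^t x.
    print(f"t={t:2d}: ||W^t x|| = {np.linalg.norm(np.linalg.matrix_power(W, t) @ x):.4f}")
# Norms shrink roughly like 0.8**t, so older inputs are progressively
# overwritten: the vanishing-memory property.
```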