Memory in linear recurrent neural networks in continuous time (2010)
DOI: 10.1016/j.neunet.2009.08.008

Cited by 96 publications (90 citation statements)
References 20 publications
“…Ref. [10] gave closed-form expressions for memory function of continuous-time linear recurrent networks in terms of the autocorrelation function of the input, and closely studied the case of an exponential autocorrelation function. Ref.…”
mentioning
confidence: 99%
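The memory function referred to above quantifies how well a linear readout of the network state can reconstruct the input presented k steps earlier. Below is a minimal sketch of estimating it empirically, assuming a discrete-time linear network driven by i.i.d. input rather than the paper's continuous-time setting with a general autocorrelation function; all names and sizes (N, T, washout, memory_function) are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, max_delay, washout = 50, 20000, 40, 200  # assumed sizes, for illustration

# Random recurrent weights, rescaled to spectral radius 0.9 so the
# linear dynamics are stable and have fading memory.
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
v = rng.normal(size=N)

u = rng.normal(size=T)               # i.i.d. Gaussian input (flat autocorrelation)
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = W @ x[t - 1] + v * u[t]   # linear recurrent state update

def memory_function(k):
    """m(k): squared correlation between u[t - k] and its best linear
    reconstruction from the state x[t]."""
    X, y = x[washout + k:], u[washout:T - k]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ w, y)[0, 1] ** 2

m = [memory_function(k) for k in range(max_delay)]
print("memory capacity ~", sum(m))   # for i.i.d. input this sum is bounded by N
```

For i.i.d. input, the summed memory function (the memory capacity) of a linear network with N units is bounded above by N; the closed-form expressions the quote mentions generalize this picture to inputs with nontrivial autocorrelation.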
“…In this paper we will research memory properties for these kinds of input signals. Past research has focused on one-dimensional signals to quite some detail [9], [10], [11], [12]. The main conclusions found in this area of study are the following.…”
Section: A Memory In Recurrent Network
mentioning
confidence: 84%
“…Therefore, the type of memory used in LR and ELM does not add dynamics, but merely serves as a noise filter, opposed to the dynamic properties in a reservoir. Consequently, RC has memory capacity [16] as RC systems can learn relations with the past, while ELM has no memory capacity due to the lack of recurrent connections between the hidden-layer nodes.…”
Section: Reservoir Computing
mentioning
confidence: 99%
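The contrast drawn in this quote can be made concrete: a reservoir state depends recursively on its own past, while an extreme learning machine (ELM) hidden layer is a static function of the current input alone. A minimal sketch under assumed small random networks and i.i.d. input follows; W, W_in, delay_r2, and all sizes are hypothetical names for illustration, not the cited papers' setup.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 30, 5000                      # assumed sizes, for illustration
W_in = rng.normal(size=N)
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # stable (echo-state-style) reservoir

u = rng.normal(size=T)

# Reservoir: each state mixes the current input with the previous state,
# so traces of past inputs persist in x[t].
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + W_in * u[t])

# ELM-style hidden layer: a static map of the current input only.
h = np.tanh(np.outer(u, W_in))

def delay_r2(states, k):
    """Squared correlation when linearly reconstructing u[t - k] from states[t]."""
    X, y = states[k:], u[:T - k]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ w, y)[0, 1] ** 2

print("delay-1 recall, reservoir:", delay_r2(x, 1))  # clearly positive
print("delay-1 recall, ELM:      ", delay_r2(h, 1))  # near zero: no recurrence, no memory
```

Because h[t] carries no information about earlier inputs, no readout trained on it can recover u[t - k] for k >= 1, which is exactly the zero-memory-capacity claim in the quoted passage.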