2021
DOI: 10.3389/fams.2020.616658

Exploiting Multiple Timescales in Hierarchical Echo State Networks

Abstract: Echo state networks (ESNs) are a powerful form of reservoir computing that only require training of linear output weights while the internal reservoir is formed of fixed randomly connected neurons. With a correctly scaled connectivity matrix, the neurons’ activity exhibits the echo-state property and responds to the input dynamics with certain timescales. Tuning the timescales of the network can be necessary for treating certain tasks, and some environments require multiple timescales for an efficient represen…
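The abstract's description of an ESN — a fixed random reservoir scaled for the echo-state property, with only a linear readout trained — can be sketched as follows. This is a minimal illustration under common assumptions (tanh units, spectral radius below 1, ridge-regression readout), not the specific hierarchical architecture of the paper; all sizes and names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1  # illustrative reservoir and input sizes

# Fixed random reservoir, rescaled so its spectral radius is below 1 --
# a common sufficient heuristic for the echo-state property.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x)
    return np.array(states)

# Only the linear readout is trained, here by ridge regression.
u = rng.uniform(0.0, 0.5, 200)
y = np.roll(u, 1)            # toy target: recall the previous input
X = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

The spectral rescaling of `W` is what sets the reservoir's intrinsic timescales; the paper's point is that a single such scaling gives one dominant timescale, whereas hierarchical reservoirs can mix several.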

Cited by 30 publications (28 citation statements)
References 42 publications
“…Timescales have been implicit in the reinforcement learning framework, in the context of propagating information about the success (or failure) of the task in cases where reward is not immediate, see for instance eligibility traces [43,59]. This is not entirely the same as the concept of time scales in this model, where the emphasis is on acquiring and retaining sensory information from the environment, not unlike what happens in the field of Reservoir Computing [60].…”
Section: Discussion
confidence: 99%
“…The complexity of the maps, while not discussed in further detail here, may be due in part to coupling between input strength and internal timescale (see supplementary). The goal of the chaotic time series task (NARMA10) is to predict the response to white noise (random numbers drawn from [0, 0.5]) of a discrete-time tenth order nonlinear auto-regressive moving average [39,40]:…”
Section: (E)
confidence: 99%
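The NARMA10 benchmark referenced in the statement above is, in its commonly used form, the tenth-order recurrence sketched below; this generates an input/target pair (function and variable names are our own, and the washout length and seed are illustrative).

```python
import numpy as np

def narma10(T, seed=0):
    """Standard NARMA10 series: input u ~ Uniform[0, 0.5] and target
    y[t+1] = 0.3*y[t] + 0.05*y[t]*sum(y[t-9..t]) + 1.5*u[t-9]*u[t] + 0.1
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 0.5, T)
    y = np.zeros(T)
    for t in range(9, T - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * np.sum(y[t - 9 : t + 1])
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y

u, y = narma10(200)
```

Because the target at step t+1 depends on the last ten inputs and outputs, the task directly probes whether a reservoir retains information over a ~10-step timescale.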
“…Yet even in populations that ostensibly encode the same kind of stimulus, like olfactory mitral cells, heterogeneity of neuronal excitability can increase the information content of their population activity [15][16][17]. In addition, heterogeneity in neuronal time scales can improve learning in neural networks [18,19]. In what contexts and in what senses might the opposite be true, i.e., when does neuronal similarity provide computational benefits over neuronal diversity?…”
Section: Introduction
confidence: 99%