The 2010 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2010.5596492
Memory versus non-linearity in reservoirs

Abstract: Reservoir Computing (RC) is increasingly being used as a conceptually simple yet powerful method for exploiting the temporal processing capabilities of recurrent neural networks (RNNs). However, because fundamental insight into the exact functionality of the reservoir is still lacking, in practice optimizing these systems still involves a lot of manual parameter tweaking or brute-force searching. In this contribution we aim to enhance the insight into reservoir operation by experimentally studying the interpla…

Cited by 78 publications (55 citation statements)
References 14 publications
“…The setting of parameters is not very critical, and was chosen to give best generalization performance. Note that a decrease in the input scaling counteracts an increase in the spectral radius to avoid a possible loss of memory capacity and of performance [37].…”
Section: Methods
confidence: 99%
“…Although it does not guarantee the ESP, in practice it has been empirically observed that this criterion works well and often produces analog sigmoid ESNs with the ESP for any input. It is important to note that a spectral radius closer to unity, as well as larger input scaling, makes the reservoir more non-linear, which degrades memory capacity as a side effect [37]. This scaling of matrices is important because it greatly influences the reservoir dynamics [38] and must therefore be chosen according to the task at hand: empirically, by analyzing the behavior of the reservoir states over time, or by grid searching.…”
Section: ESN Model
confidence: 99%
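The spectral-radius scaling described in the statements above is commonly done by dividing the random recurrent weight matrix by its largest absolute eigenvalue and multiplying by the target radius. A minimal numpy sketch (not from the paper itself; matrix size and the target value 0.9 are illustrative choices):

```python
import numpy as np

def scale_to_spectral_radius(W, rho):
    """Rescale a reservoir weight matrix so its spectral radius equals rho."""
    current = max(abs(np.linalg.eigvals(W)))  # largest eigenvalue magnitude
    return W * (rho / current)

rng = np.random.default_rng(0)
W = rng.standard_normal((100, 100))          # unscaled random reservoir
W_scaled = scale_to_spectral_radius(W, 0.9)  # spectral radius now ≈ 0.9
```

As the quoted statements note, this rescaling alone does not guarantee the echo state property, but values below unity are a widely used heuristic starting point for a grid search.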
“…Considering a normalized input signal u[n], the effect of the input scaling υ_ri on the reservoir dynamics is such that the larger the scaling, the closer to saturation the reservoir states will be, since the reservoir state is shifted towards the non-linear area of the tanh activation function. A spectral radius closer to unity, as well as larger input scaling, makes the reservoir more non-linear, which degrades memory capacity as a side effect [38].…”
Section: Methods
confidence: 99%
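The saturation effect described above can be observed directly by driving a small tanh reservoir with weak versus strong input scaling and comparing the mean absolute state values. A sketch under assumed settings (reservoir size, spectral radius 0.9, and the two scaling values are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 50, 500
W = rng.standard_normal((n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # fix spectral radius at 0.9
w_in = rng.standard_normal(n)               # input weights
u = rng.uniform(-1, 1, T)                   # normalized input signal u[n]

def mean_abs_state(input_scaling):
    """Average |state| after a transient; values near 1 indicate saturation."""
    x = np.zeros(n)
    states = []
    for t in range(T):
        x = np.tanh(W @ x + input_scaling * w_in * u[t])
        states.append(np.abs(x).mean())
    return float(np.mean(states[100:]))     # discard the initial transient

weak, strong = mean_abs_state(0.1), mean_abs_state(5.0)
# the strongly driven reservoir operates much closer to tanh saturation
```

Here `weak < strong` should hold: larger input scaling pushes the pre-activation into the flat region of tanh, which is exactly the non-linear regime the quoted statement associates with reduced memory capacity.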
“…It is important to note that a spectral radius closer to unity, as well as larger input scaling, makes the reservoir more non-linear, which degrades memory capacity as a side effect [13]. The scaling of these non-trainable weights is a parameter that should be chosen according to the task at hand: empirically, by analyzing the behavior of the reservoir state over time, or by grid searching over parameter ranges.…”
Section: ESN Model
confidence: 99%
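The memory-versus-non-linearity trade-off that all of these statements cite can be checked empirically with Jaeger's short-term memory capacity measure: train a linear readout to reconstruct delayed copies of the input and sum the squared correlations over delays. A minimal numpy sketch (reservoir size, delays, and the two input-scaling values are illustrative assumptions, not the paper's experimental setup):

```python
import numpy as np

def memory_capacity(rho, input_scaling, n=50, T=2000, max_delay=30, seed=2):
    """Estimate short-term memory capacity of a tanh reservoir."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n))
    W *= rho / max(abs(np.linalg.eigvals(W)))   # set spectral radius
    w_in = rng.standard_normal(n) * input_scaling
    u = rng.uniform(-1, 1, T)                   # i.i.d. normalized input
    X = np.zeros((T, n))
    x = np.zeros(n)
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])
        X[t] = x
    washout = 100
    mc = 0.0
    for k in range(1, max_delay + 1):
        y = u[washout - k:T - k]                # target: input delayed by k
        Z = X[washout:]
        w = np.linalg.lstsq(Z, y, rcond=None)[0]
        r = np.corrcoef(Z @ w, y)[0, 1]         # reconstruction correlation
        mc += r ** 2
    return mc

near_linear = memory_capacity(0.9, 0.05)  # weakly driven, near-linear regime
saturated = memory_capacity(0.9, 5.0)     # strongly driven, non-linear regime
# near_linear should exceed saturated: non-linearity costs memory capacity
```

This reproduces the qualitative effect the citing papers describe: holding the spectral radius fixed, increasing the input scaling drives the reservoir into its non-linear regime and the measured memory capacity drops.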