2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9206786

A Gated Recurrent Unit based Echo State Network

Cited by 4 publications (5 citation statements)
References 23 publications
“…ReservoirComputing.jl provides a simple interface for model building that closely follows the workflow presented in the corresponding literature. It includes a standard implementation of ESNs [Lukoševičius, 2012] as well as a hybrid variation [Pathak et al., 2018b], gated recurrent unit ESN [Wang et al., 2020] and double activation function ESN [Lun et al., 2015]. Multiple input layers χ and reservoirs R are also provided, ranging from weighted input layers [Lu et al., 2017] to minimally complex input layers and reservoirs [Rodan and Tino, 2010, Rodan and Tiňo, 2012], including a reservoir obtained through pseudo singular value decomposition [Yang et al., 2018].…”
Section: Library Overview
confidence: 99%
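The "standard implementation of ESNs" mentioned in the excerpt follows the usual reservoir-computing workflow: a fixed random input layer, a fixed sparse reservoir, and a trained linear readout. Below is a minimal NumPy sketch of that workflow; it is an illustration, not the ReservoirComputing.jl API, and every name and parameter value in it is assumed for the example.

```python
# Minimal ESN sketch (illustrative; not the ReservoirComputing.jl API).
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n, density=0.1, spectral_radius=0.9):
    """Sparse random reservoir matrix rescaled to a target spectral radius."""
    W = rng.uniform(-1, 1, (n, n)) * (rng.random((n, n)) < density)
    return W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))

def fit_esn(u, n_res=200, ridge=1e-6):
    """Drive the reservoir with input u (shape T x d) and fit a ridge readout."""
    T, d = u.shape
    W_in = rng.uniform(-0.1, 0.1, (n_res, d))    # untrained input layer
    W = make_reservoir(n_res)                    # untrained reservoir
    X = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W_in @ u[t] + W @ x)         # collect reservoir states
        X[t] = x
    # Train only the readout: one-step-ahead prediction of the input.
    A, Y = X[:-1], u[1:]
    W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ Y)
    return W_in, W, W_out

W_in, W, W_out = fit_esn(np.sin(0.2 * np.arange(2000))[:, None])
```

Only W_out is fitted; the input and reservoir matrices stay untrained, which is what reduces training to a single linear solve.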
“…The process of extending the architecture of ESNs with gating mechanisms was first investigated in our previous work [9] and, concurrently and independently, in [32]. The two works share a similar idea: extend the state transition function of an ESN to include the same gating mechanisms as a GRU, while keeping all the parameters in the gates untrained.…”
Section: Related Work
confidence: 99%
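Concretely, the shared idea described in the excerpt amounts to replacing the plain ESN state transition with GRU-style gates whose weights remain random. A hedged sketch of the gated update in standard GRU notation (the symbols are assumed here, not copied from [9] or [32]):

$$
\begin{aligned}
z_t &= \sigma(W_z u_t + U_z x_{t-1}) && \text{(update gate, untrained)} \\
r_t &= \sigma(W_r u_t + U_r x_{t-1}) && \text{(reset gate, untrained)} \\
\tilde{x}_t &= \tanh\!\big(W u_t + U (r_t \odot x_{t-1})\big) && \text{(candidate state)} \\
x_t &= (1 - z_t) \odot x_{t-1} + z_t \odot \tilde{x}_t
\end{aligned}
$$

Only the linear readout on $x_t$ is trained; $W_z, U_z, W_r, U_r, W, U$ are all drawn at random and left fixed, which preserves the reservoir-computing training regime.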
“…While the underlying idea of [9] and [32] is similar, in [9] and in the current work we also take care to (1) consider the fundamental details of the initialization strategy, (2) perform an experimental evaluation on a dataset that makes it possible to evaluate the actual influence of the gates more clearly, and (3) perform a more extensive evaluation of the competing reservoir computing models, which led to important insights about the feasibility of the approach. Moreover, the current paper further extends our previous work [9] to include (a) a theoretical analysis of the state dynamics in the proposed models, (b) an analysis and discussion of the agreement between the theoretical results and the experimental measurements, (c) an expanded hyperparameter search for the experiments, (d) the collection and discussion of additional measurements of the gate activations, and (e) the reporting and discussion of additional measurements regarding the weight matrices.…”
Section: Related Work
confidence: 99%
“…The gated recurrent unit (GRU) [133] is another enhanced variant of the RNN. In [134,135], the authors presented GRU-based ESNs to tackle complex real-world tasks while reducing computational costs. The reservoir unit was replaced by sparsely connected GRU neurons.…”
Section: Combinations of ESN and Deep Learning
confidence: 99%
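A minimal sketch of that replacement, with sparse, random, untrained gate matrices standing in for the plain tanh reservoir (NumPy; the names, sparsity, and scaling are assumptions for illustration, not the exact construction in [134,135]):

```python
# GRU-gated reservoir step (illustrative; gates are random and untrained).
import numpy as np

rng = np.random.default_rng(1)

def sparse(rows, cols, density=0.1, scale=0.5):
    """Sparse random matrix with entries in [-scale, scale]."""
    mask = rng.random((rows, cols)) < density
    return rng.uniform(-scale, scale, (rows, cols)) * mask

def gru_reservoir_step(u, x, P):
    """One state update using GRU gates built from fixed random matrices."""
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sig(P["Wz"] @ u + P["Uz"] @ x)           # update gate
    r = sig(P["Wr"] @ u + P["Ur"] @ x)           # reset gate
    h = np.tanh(P["W"] @ u + P["U"] @ (r * x))   # candidate state
    return (1 - z) * x + z * h

n_res, d_in = 200, 1
P = {k: sparse(n_res, d_in) for k in ("Wz", "Wr", "W")}
P.update({k: sparse(n_res, n_res) for k in ("Uz", "Ur", "U")})
x = np.zeros(n_res)
for t in range(100):                             # drive with a toy input signal
    x = gru_reservoir_step(np.array([np.sin(0.1 * t)]), x, P)
```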
“…To test ESNs, most existing works set s = 5 with α1 = 0.2, α2 = 0.311, α3 = 0.42, α4 = 0.51, α5 = 0.63 [43,134]. • Lorenz system.…”
Section: Benchmark Datasets
confidence: 99%
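The s = 5 setting with those frequencies matches the multiple superimposed oscillators (MSO) benchmark, where the target is a sum of s sinusoids, y(t) = Σᵢ sin(αᵢ t). A minimal generator under that assumption (the exact scaling and train/test split used in [43,134] may differ):

```python
# MSO-style benchmark signal: a sum of s sinusoids (conventions assumed).
import numpy as np

def mso(T, alphas=(0.2, 0.311, 0.42, 0.51, 0.63)):
    """y(t) = sum_i sin(alpha_i * t) for t = 0, ..., T-1."""
    t = np.arange(T)
    return np.sum([np.sin(a * t) for a in alphas], axis=0)

y = mso(1000)   # typical use: one-step-ahead prediction with an ESN
```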