DOI: 10.1007/978-3-540-87536-9_83

Stable Output Feedback in Reservoir Computing Using Ridge Regression

Cited by 47 publications (32 citation statements)
References: 8 publications
“…In recent years, a number of applications of ESN to streamflow forecasting [7][8][9] for hydropower plants and to load forecasting [10][11][12] for power systems have been reported in the literature. The results indicate that ESN not only benefits from feedback connections, like other RNNs, that enable it to model complex dynamic behavior, but also uses a sparsely interconnected reservoir of neurons, leading to a very fast and simple training procedure, unlike the complicated and time-consuming training of other RNNs without a reservoir.…”
Section: Introduction (mentioning)
confidence: 99%
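As background to the statement above, the following is a minimal, illustrative sketch (in Python/NumPy, not taken from any of the cited papers) of the reservoir idea it refers to: a fixed, sparsely connected recurrent weight matrix whose states are collected and later read out linearly. The reservoir size, sparsity, and spectral radius used here are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1

# Sparse random reservoir: roughly 5% of the connections are nonzero (assumed value).
W = rng.uniform(-1.0, 1.0, (n_res, n_res)) * (rng.random((n_res, n_res)) < 0.05)
# Rescale to a spectral radius below 1 (a common, assumed choice).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Collect reservoir states x[n] for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_n in u:
        x = np.tanh(W @ x + W_in @ u_n)  # fixed, untrained recurrent dynamics
        states.append(x.copy())
    return np.array(states)
```

Because W and W_in stay fixed, training reduces to fitting the linear readout on the collected states, which is what makes the procedure fast compared with fully trained RNNs.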
“…Although this method could achieve better forecast results, the regularization parameter was hard to determine and the cross-validation process was time-consuming. Wyffels et al. [9] utilized the ridge regression algorithm to obtain the optimal output weights; however, the ridge parameter is hard to determine. Bayesian theory, which is usually used as a parameter-regularization algorithm to optimize the parameters of feedforward neural networks (FNN), has begun to be employed to optimize the output weights of the ESN.…”
Section: Introduction (mentioning)
confidence: 99%
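To make the role of the ridge parameter concrete, here is a hedged sketch of a ridge-regression readout of the kind the statement refers to. The closed-form solution and the naive hold-out search over candidate values of lam are illustrative assumptions; they are not the procedure used by Wyffels et al. [9] or by the citing authors.

```python
import numpy as np

def train_readout(states, targets, lam=1e-6):
    """Ridge readout: W_out^T = (X^T X + lam*I)^(-1) X^T Y, with X the collected states."""
    X, Y = np.asarray(states), np.asarray(targets)
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ Y).T  # shape: (n_out, n_res)

def select_lambda(states, targets, candidates=(1e-8, 1e-6, 1e-4, 1e-2, 1.0)):
    """Naive hold-out search over candidate ridge parameters (an assumed selection scheme)."""
    X, Y = np.asarray(states), np.asarray(targets)
    split = int(0.8 * len(X))
    best, best_err = None, np.inf
    for lam in candidates:
        W_out = train_readout(X[:split], Y[:split], lam)
        err = np.mean((X[split:] @ W_out.T - Y[split:]) ** 2)
        if err < best_err:
            best, best_err = lam, err
    return best
```

The difficulty the citing authors point out is visible here: the quality of the readout depends on lam, which has to be chosen by some external procedure such as the hold-out search sketched above or cross-validation.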
“…The stability of the generated output signal y[n] is essential for the identification task and can be achieved by using noise injection during training [21] or by finding the optimal regularization parameter λ in ridge regression [43]. To test the hypothesis of stability, two experiments were devised using the test data: the first consisted of adding a single large and increasing perturbation over 6 seconds, whereas the second consisted of adding Gaussian noise to y[n] at each timestep.…”
Section: Results (mentioning)
confidence: 99%
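The second experiment described above (Gaussian noise added to y[n] at each timestep) can be illustrated with a short free-running loop. Everything here, including the feedback matrix W_fb, the readout W_out, and the noise level, is a hypothetical sketch rather than the cited authors' code; a stable model should keep producing bounded outputs under this perturbation.

```python
import numpy as np

def free_run_with_noise(W, W_fb, W_out, n_steps, noise_std=0.01, seed=0):
    """Run the trained network in output-feedback mode while perturbing y[n] each step."""
    rng = np.random.default_rng(seed)
    x = np.zeros(W.shape[0])
    y = np.zeros(W_out.shape[0])
    outputs = []
    for _ in range(n_steps):
        y_fb = y + rng.normal(0.0, noise_std, size=y.shape)  # perturbed feedback signal
        x = np.tanh(W @ x + W_fb @ y_fb)                     # reservoir state update
        y = W_out @ x                                        # linear readout
        outputs.append(y.copy())
    return np.array(outputs)
```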
“…Furthermore, stabilization of the system with output feedback is a concern that must be addressed. This can be achieved by state noise injection [21] or by regularizing the readout [43].…”
Section: ESN Model (mentioning)
confidence: 99%
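For the first of the two stabilization options mentioned (state noise injection during training [21]), a minimal sketch might look as follows; the teacher-forcing setup, noise level, and variable names are assumptions for illustration only. Adding a small noise term to the state while collecting teacher-forced training data makes the subsequently fitted readout more tolerant of its own feedback errors at run time.

```python
import numpy as np

def collect_states_with_noise(W, W_fb, teacher, noise_std=1e-3, seed=0):
    """Teacher-forced state collection with state noise injection."""
    rng = np.random.default_rng(seed)
    n_res = W.shape[0]
    x = np.zeros(n_res)
    states = []
    for y_teach in teacher:                                # desired output fed back during training
        x = np.tanh(W @ x + W_fb @ y_teach)
        x += rng.normal(0.0, noise_std, size=n_res)        # injected state noise
        states.append(x.copy())
    return np.array(states)
```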