2016
DOI: 10.1007/s00521-016-2698-5
Recursive Bayesian echo state network with an adaptive inflation factor for temperature prediction

Cited by 13 publications (4 citation statements). References 13 publications.
“…Moreover, in dynamic data settings, the reliability of earlier data also needs to be considered. In [198], the authors presented a recursive Bayesian linear regression ESN (RBLR-ESN) to adjust the confidence level placed on prior data.…”
Section: Abnormal Data Orientated
Mentioning confidence: 99%
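The RBLR-ESN idea above, recursively updating a linear readout while down-weighting the confidence placed on prior data, can be illustrated with a standard recursive Bayesian least-squares update whose prior covariance is inflated by a forgetting factor. This is a generic sketch, not the paper's exact algorithm; the function name and the fixed factor `lam` are illustrative (the paper makes the inflation factor adaptive).

```python
import numpy as np

def rbls_update(w, P, x, y, lam=0.98):
    """One recursive Bayesian least-squares step for a linear readout.

    Dividing the prior covariance P by lam (lam < 1) inflates it,
    which lowers the confidence placed on older data relative to
    the newest sample.
    w: (d,) weight estimate; P: (d, d) posterior covariance;
    x: (d,) feature vector (e.g. a reservoir state); y: scalar target.
    """
    P = P / lam                      # inflate prior covariance
    Px = P @ x
    k = Px / (1.0 + x @ Px)          # Kalman-style gain
    w = w + k * (y - w @ x)          # correct by the prediction error
    P = P - np.outer(k, Px)          # posterior covariance update
    return w, P

# Recover a known linear map from a stream of samples
rng = np.random.default_rng(0)
d = 3
w_true = np.array([1.0, -2.0, 0.5])
w, P = np.zeros(d), np.eye(d) * 100.0
for _ in range(500):
    x = rng.normal(size=d)
    y = w_true @ x
    w, P = rbls_update(w, P, x, y)
print(np.round(w, 3))
```

With noiseless streaming data the estimate converges to `w_true`; the forgetting factor mainly matters when the underlying map drifts over time, as in temperature series.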
“…Many works tested their models on this task [263]. Besides, there are also many ESN-based models designed for solar irradiance prediction [264,265,266,267,268], meteorological forecasting [261,111,269], temperature prediction [198], water/stream flow forecasting [270,271,163] and wind speed forecasting [272,131,273] • Financial applications.…”
Section: Real-world Tasks Orientated
Mentioning confidence: 99%
“…Although neural networks have good self-learning ability and can capture nonlinearity in prediction, they also suffer from problems such as a tendency to fall into local optima, slow convergence, a tendency to oscillate, and difficulty in determining the number of hidden-layer units. Huang et al. [14] proposed a recursive Bayesian algorithm based on an echo state network model to predict temperature time series. As a new type of recurrent network, the echo state network overcomes the problems that traditional recurrent networks easily fall into local optima and require complex training algorithms, but the randomly generated reservoir in the echo state network is not related to the specific problem and its parameters are difficult to determine.…”
Section: Introduction
Mentioning confidence: 99%
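The statement above notes why the ESN avoids local optima: the recurrent reservoir is generated randomly and left fixed, and only a linear readout is fitted in closed form. A minimal sketch of a standard ESN with a ridge-regression readout (not the recursive Bayesian variant of [14]; the toy series and all parameter values are illustrative):

```python
import numpy as np

def make_reservoir(n_in, n_res, spectral_radius=0.9, seed=0):
    """Random input and reservoir weights; neither is ever trained."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.normal(size=(n_res, n_res))
    # Rescaling to spectral radius < 1 is the usual recipe for the
    # echo state property (fading memory of past inputs).
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

# One-step-ahead prediction of a toy quasi-periodic series
t = np.linspace(0, 8 * np.pi, 800)
series = np.sin(t) + 0.1 * np.sin(3.7 * t)
u, y = series[:-1, None], series[1:]

W_in, W = make_reservoir(n_in=1, n_res=100)
X = run_reservoir(W_in, W, u)

# Ridge-regression readout: the only trained part, solved in closed
# form, so there is no gradient descent and no local-optima issue.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(100), X.T @ y)
pred = X @ W_out
print(f"train MSE: {np.mean((pred - y) ** 2):.2e}")
```

The closed-form readout is also what makes Bayesian reformulations natural: replacing the ridge solve with a recursive Bayesian update yields online variants such as the RBLR-ESN discussed above.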
“…The first group of optimizing approaches vary the details of the original ESN, such as varying the gain and bias of the activation function [1], replacing the normalized root mean square error (NRMSE) by correntropy [2,3] and adding a Laplacian algorithm [4] to reduce the dimension of the reservoir states. The second group of optimizing approaches mainly combine the ESN with other complementary neural networks or methods, including but not limited to echo state queueing networks (ESQN) [5], robust echo state networks (RESN) [2], a combined structure of reservoir computing and support vector machines (RCSVM) [6], deep belief echo state network (DBESN) [7], deep recurrent neural network (DRNN) [8], optimized echo state network (OESN) by a binary particle swarm optimization algorithm [9], stacked denoising autoencoders (SDA) [10], dynamical regularized echo state network (DRESN) [11], adaptive lasso echo state network (ALESN) [12], recursive Bayesian echo state network (RBESN) [13], polynomial echo state network [14], multilayered echo state network (MLESN) [15] and stacked deep echo state network (DESN) [16].…”
Section: Introduction
Mentioning confidence: 99%