2018
DOI: 10.1016/j.ymssp.2018.03.047
Echo state kernel recursive least squares algorithm for machine condition prediction

Cited by 22 publications (9 citation statements)
References: 46 publications
“…is the regularization factor. In order to keep the size of the kernel matrix unchanged, the kernel matrix uses the new sample data to add new rows and new columns by (21)-(23).…”
Section: B. Sliding Window Methods (mentioning)
confidence: 99%
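
The excerpt above describes the key bookkeeping step of the sliding-window method: each new sample appends one row and one column to the kernel (Gram) matrix, and the oldest sample is discarded so the matrix size stays fixed. Below is a minimal sketch of that step, assuming a Gaussian kernel and an illustrative window size and regularization value; equations (21)-(23) belong to the cited paper and are not reproduced here.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / (2.0 * sigma ** 2))

def grow_and_slide(window, K, x_new, reg=1e-3, max_size=50):
    """Add x_new as a new row/column of K, then drop the oldest sample if needed."""
    k_col = np.array([gaussian_kernel(x, x_new) for x in window])   # new column entries
    k_self = gaussian_kernel(x_new, x_new) + reg                    # regularized diagonal entry
    K = np.block([[K, k_col[:, None]],
                  [k_col[None, :], np.array([[k_self]])]])
    window = window + [x_new]
    if len(window) > max_size:                                      # keep the matrix size unchanged
        window, K = window[1:], K[1:, 1:]
    return window, K

# Usage: seed with one sample, then feed the stream.
x0 = np.array([0.1, 0.2])
window, K = [x0], np.array([[gaussian_kernel(x0, x0) + 1e-3]])
window, K = grow_and_slide(window, K, np.array([0.3, 0.1]))
```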
“…else, calculate the maximum changeable window size M1 and calculate the kernel matrix according to (21)-(23). When the size of the kernel matrix is restored to M1, calculate the kernel matrix according to (24)-(25).…”
Section: Improved Kernel Recursive Least Square Algorithm (mentioning)
confidence: 99%
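
The fragment above outlines the citing paper's variable-window policy: grow the dictionary until the maximum changeable window size M1 is reached, then apply the add-row/column and drop-oldest updates so the matrix is restored to size M1. The sketch below restates that policy with a direct (non-recursive) solve for the readout coefficients, purely for clarity; the cited algorithm updates the inverse kernel matrix recursively, and the window size, kernel width, and regularization here are assumptions.

```python
import numpy as np

def sliding_window_krls(stream, M1=50, reg=1e-3, sigma=1.0):
    """stream: iterable of (input vector, target) pairs; returns one-step predictions."""
    window, targets, preds = [], [], []
    for x, y in stream:
        if window:
            k = np.array([np.exp(-np.sum((xi - x) ** 2) / (2 * sigma ** 2))
                          for xi in window])
            K = np.array([[np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))
                           for b in window] for a in window])
            alpha = np.linalg.solve(K + reg * np.eye(len(window)), np.array(targets))
            preds.append(k @ alpha)                       # predict before updating
        else:
            preds.append(0.0)
        window.append(x)
        targets.append(y)
        if len(window) > M1:                              # restore the window to size M1
            window.pop(0)
            targets.pop(0)
    return preds
```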
“…In the second part, the fusion of a Bayesian classifier is recommended as an effective solution to diagnose an incipient interturn fault in the machine. To track the health status of a degraded system and predict the remaining service life of a turbofan engine, Zhou et al [87] propose a method that combines the echo state kernel recursive least-squares algorithm with a Bayesian technique, which demonstrates excellent performance in long-term prediction.…”
Section: Hybrid Techniques (mentioning)
confidence: 99%
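
The statement above summarizes the combination attributed to Zhou et al. [87]: an echo state network reservoir driven by a condition-monitoring signal, with a kernel least-squares readout used for long-term prediction. The following is a minimal sketch of that combination on a synthetic signal; the reservoir size, spectral radius, kernel width, regularization, and the omitted Bayesian treatment of prediction uncertainty are illustrative assumptions, not the cited implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                                   # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))                 # scale spectral radius to 0.9

def reservoir_states(u):
    """Run the input sequence u through the reservoir and collect the states."""
    x, states = np.zeros(N), []
    for u_t in u:
        x = np.tanh(W_in[:, 0] * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

def fit_kernel_readout(S, y, reg=1e-3, sigma=2.0):
    """Kernel least-squares readout on reservoir states S with targets y."""
    K = np.exp(-np.sum((S[:, None, :] - S[None, :, :]) ** 2, axis=2) / (2 * sigma ** 2))
    alpha = np.linalg.solve(K + reg * np.eye(len(S)), y)
    return lambda s: np.exp(-np.sum((S - s) ** 2, axis=1) / (2 * sigma ** 2)) @ alpha

series = np.sin(np.linspace(0, 20, 200))                  # stand-in health indicator
S = reservoir_states(series[:-1])
predict = fit_kernel_readout(S, series[1:])               # one-step-ahead readout
print(predict(S[-1]))                                     # next-value estimate
```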
“…Other robust ESN configurations give up the usual linear output layer in favor of a nonlinear one, seeking to capture as much information as possible from the reservoir of dynamics. One of these approaches is the formulation using a support vector machine with a robust cost function, such as the ε-insensitive loss or the Huber function (Shi and Han, 2007). It is also possible to use nonlinear learning rules, such as the kernel recursive least squares algorithm (Zhou et al., 2018). Although they have not been evaluated directly in scenarios with outliers, there are works in which ESN architectures trained with the Laplacian eigenmaps algorithm (Han and Xu, 2018) and Gaussian process regression models (Chatzis and Demiris, 2011) may be able to handle the adverse effects caused by outliers.…”
Section: Introduction (unclassified)
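
The passage above surveys robust readouts for the ESN (ε-insensitive or Huber-loss support vector formulations, kernel recursive least squares, Laplacian eigenmaps, Gaussian process regression). As one concrete illustration, the sketch below trains a Huber-loss linear readout on precomputed reservoir states using scikit-learn's HuberRegressor; this is an assumed, generic choice, not the implementation used in any of the cited works.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def train_robust_readout(states, targets):
    """states: (T, N) reservoir state matrix; targets: (T,) desired outputs."""
    readout = HuberRegressor(epsilon=1.35, alpha=1e-4)    # assumed hyperparameters
    readout.fit(states, targets)
    return readout

# Usage with random stand-in states (replace with real reservoir states):
S = np.random.default_rng(1).normal(size=(200, 50))
y = S[:, 0] + 0.1 * np.random.default_rng(2).normal(size=200)
y[::25] += 5.0                                            # inject a few outliers
print(train_robust_readout(S, y).coef_[0])
```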