2021
DOI: 10.1049/ccs2.12030

Minimum error entropy criterion‐based randomised autoencoder

Abstract: The extreme learning machine-based autoencoder (ELM-AE) has attracted a lot of attention due to its fast learning speed and promising representation capability. However, the existing ELM-AE algorithms only reconstruct the original input and generally ignore the probability distribution of the data. The minimum error entropy (MEE), as an optimal criterion considering the distribution statistics of the data, is robust in handling non-linear systems and non-Gaussian noises. The MEE is equivalent to the minimisati…
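As a brief aside (a standard textbook formulation of the criterion named in the abstract, not text from the paper itself): the MEE criterion minimises Rényi's quadratic entropy of the reconstruction error $e = y - \hat{y}$, which in practice is estimated from samples with a Parzen window. With a Gaussian kernel $\kappa_\sigma$, the empirical objective is

$$\hat{H}_2(e) = -\log \hat{V}(e), \qquad \hat{V}(e) = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N} \kappa_\sigma(e_i - e_j),$$

so minimising the error entropy is equivalent to maximising the information potential $\hat{V}(e)$ over the model parameters.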

Cited by 4 publications (4 citation statements)
References: 46 publications
“…Correntropy is a new concept that measures the similarity of two random variables [19, 20, 26]. Consider two random variables X and Y with a given joint density function $F_{X,Y}(x,y)$; the correntropy $C(X,Y)$ is defined as the integral of the error under the MCC, that is, the kernel transformation of the error.…”
Section: Gaussian‐sum and Maximum Correntropy Adaptive CKF
Confidence: 99%
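For readers of the excerpt above, the standard definition it refers to (again a textbook formulation, not text from the cited paper) is

$$C(X,Y) = \mathbb{E}\left[\kappa_\sigma(X-Y)\right] = \iint \kappa_\sigma(x-y)\, \mathrm{d}F_{X,Y}(x,y),$$

where $\kappa_\sigma$ is usually the Gaussian kernel $\kappa_\sigma(x-y) = \exp\!\left(-\frac{(x-y)^2}{2\sigma^2}\right)$; maximising $C(X,Y)$ over the model parameters gives the maximum correntropy criterion (MCC).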
“…The authors of [19, 26] applied the UKF method, which improves the minimum-covariance objective function of the Kalman filter and processes the error in the form of a kernel function. This idea has also been applied to neural-network data-processing methods [20], showing that entropy theory can extract more information and yield more accurate results. Figure 2 shows the MCC kernel function $\kappa(x,y)$ in the space of x and y.…”
Section: Gaussian‐sum and Maximum Correntropy Adaptive CKF
Confidence: 99%
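A minimal numerical sketch of the Gaussian MCC kernel and its sample-mean correntropy estimate (illustrative code, not taken from any of the cited works; the kernel width and the toy data are assumptions):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """MCC kernel: kappa_sigma(x, y) = exp(-(x - y)^2 / (2 * sigma^2))."""
    return np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))

def correntropy(x, y, sigma=1.0):
    """Sample-mean estimate of the correntropy C(X, Y) from paired samples."""
    return np.mean(gaussian_kernel(np.asarray(x), np.asarray(y), sigma))

# Illustrative usage: correntropy stays high when errors x - y are small
# relative to sigma, and is insensitive to a few large non-Gaussian outliers.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = x + 0.1 * rng.normal(size=1000)  # small Gaussian errors
y[::100] += 10.0                     # a few heavy-tailed outliers
print(correntropy(x, y, sigma=1.0))
```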
“…Recently, the randomized autoencoder (RAE), especially the randomized neural network (RNN) [1][2][3][4] based autoencoder (RNN-AE), has attracted much attention due to its fast learning speed, ease of implementation and low need for human intervention [5][6][7][8][9][10][11][12][13][14][15][16]. The RNN-AE can be traced back to [5], which uses random, untuned hidden-layer parameters and trains only the output weights for representation learning.…”
Section: Introduction
Confidence: 99%
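As a rough illustration of the workflow this excerpt describes, here is a generic randomised-autoencoder sketch with a plain least-squares reconstruction objective (not the MEE-based method proposed in the paper; the function names, hidden size and regularisation value are illustrative assumptions):

```python
import numpy as np

def rnn_ae_fit(X, n_hidden=64, reg=1e-3, seed=None):
    """Randomised autoencoder: hidden-layer parameters are random and untuned;
    only the output weights beta are trained (here via ridge regression)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                           # hidden representation
    # Closed-form ridge solution: beta = (H^T H + reg*I)^(-1) H^T X
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return W, b, beta

def rnn_ae_reconstruct(X, W, b, beta):
    """Reconstruct the input from the random hidden representation."""
    return np.tanh(X @ W + b) @ beta

# Illustrative usage
X = np.random.default_rng(1).normal(size=(200, 20))
W, b, beta = rnn_ae_fit(X, n_hidden=64, seed=0)
X_hat = rnn_ae_reconstruct(X, W, b, beta)
print(np.mean((X - X_hat) ** 2))  # reconstruction MSE
```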