2011
DOI: 10.1016/j.sigpro.2010.09.004

Correntropy: Implications of nonGaussianity for the moment expansion and deconvolution

Abstract: The recently introduced correntropy function is an interesting and useful similarity measure between two random variables which has found myriad applications in signal processing. A series expansion for correntropy in terms of higher-order moments of the difference between the two random variables has been used to try to explain its statistical properties for uses such as deconvolution. We examine the existence and form of this expansion, showing that it may be divergent, e.g., when the difference has the Lapl…

Cited by 7 publications (4 citation statements)
References 26 publications
“…where σ is the kernel width, X and Y are two arbitrary random variables, E[·] is the expectation operator, and κ(·) is a symmetric positive definite kernel function. Correntropy is calculated using the Gaussian kernel in most works found in the literature [15], [17]:…”
Section: B. Maximum Correntropy Criterion
confidence: 99%
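The excerpt above defines correntropy as V(X, Y) = E[κ_σ(X − Y)], typically with a Gaussian kernel. A minimal sketch of the usual sample estimator (function names here are illustrative, not from the paper):

```python
import numpy as np

def gaussian_kernel(e, sigma):
    # G_sigma(e) = exp(-e^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)
    return np.exp(-e**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy(x, y, sigma=1.0):
    # Sample estimate of V(X, Y) = E[G_sigma(X - Y)],
    # averaging the kernel over paired samples of X and Y.
    return float(np.mean(gaussian_kernel(np.asarray(x) - np.asarray(y), sigma)))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.1 * rng.normal(size=10_000)  # y: noisy copy of x
print(correntropy(x, y, sigma=1.0))
```

Since the Gaussian kernel peaks at zero, the estimate approaches 1/(√(2π)σ) as X and Y become identical and decays as they drift apart, which is what makes it usable as a similarity measure.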
“…A careful analysis of (4) was considered in [4], where the authors show that the series may diverge depending on the distribution of the signal being considered. However, for shorter-tailed distributions such as the uniform, it is also possible to derive conditions under which the series converges [4]. Nevertheless, the series need not exist in order for correntropy to exist.…”
Section: V(X, Y) = E[g(X − Y, σ)]
confidence: 99%
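The convergence behavior quoted above can be seen from the moment expansion of Gaussian-kernel correntropy, V_σ = (1/(√(2π)σ)) Σ_n (−1)^n E[(X−Y)^{2n}] / (2^n n! σ^{2n}). For a uniform difference on [−a, a] the even moments are a^{2n}/(2n+1) and the terms shrink, while for a Laplace difference of scale b they are (2n)! b^{2n} and the terms grow without bound. A sketch using these analytical moments (the helper name is illustrative):

```python
from math import factorial, sqrt, pi

def series_term(n, m2n, sigma):
    # n-th term of the moment expansion of Gaussian-kernel correntropy,
    # where m2n = E[(X - Y)^(2n)] is the 2n-th moment of the difference.
    return (-1)**n * m2n / (2**n * factorial(n) * sigma**(2 * n)) / (sqrt(2 * pi) * sigma)

sigma, a, b = 1.0, 0.5, 1.0
for n in (0, 2, 5, 10, 15):
    t_unif = series_term(n, a**(2 * n) / (2 * n + 1), sigma)   # uniform: terms -> 0
    t_lapl = series_term(n, factorial(2 * n) * b**(2 * n), sigma)  # Laplace: terms blow up
    print(n, t_unif, t_lapl)
```

The Laplace terms have magnitude (2n)!/(2^n n!) = (2n − 1)!!, which diverges, even though the correntropy itself (the expectation of a bounded kernel) is always finite, matching the excerpt's point that correntropy can exist without its series expansion existing.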
“…If its value is too large, correntropy will basically rely on second-order properties. On the other hand, if the value is too small, an undesirable behavior can be observed, in which the correntropy is dominated by moments of extremely high order [4]. In that sense, σ plays a different role in correntropy, being related to the weights on the statistical moments, while, in the information potential, σ is closely related to the shape of the distributions.…”
Section: V(X, Y) = E[g(X − Y, σ)]
confidence: 99%
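The role of σ described above can be made concrete: in the moment expansion, the 2n-th moment E[(X−Y)^{2n}] is weighted by 1/(2^n n! σ^{2n}), so a large σ suppresses high-order terms (leaving second-order behavior) while a small σ amplifies them. A short sketch of these weights (illustrative helper name, up to the common 1/(√(2π)σ) factor):

```python
from math import factorial

def moment_weight(n, sigma):
    # Weight multiplying E[(X - Y)^(2n)] in the moment expansion,
    # omitting the common 1/(sqrt(2*pi)*sigma) factor.
    return 1.0 / (2**n * factorial(n) * sigma**(2 * n))

for sigma in (0.2, 1.0, 5.0):
    weights = [moment_weight(n, sigma) for n in range(6)]
    # normalize by the second-order weight (n = 1) to compare relative influence
    print(sigma, [w / weights[1] for w in weights])
```

With σ = 5 the fifth-order weight is negligible relative to the second-order one, whereas with σ = 0.2 it is orders of magnitude larger, illustrating the small-σ regime the excerpt calls undesirable.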
“…The criteria were presented encompassing aspects like the analytical computation of the costs and their estimation, as well as their gradient computation for application in gradient-based optimization methods. It is important to emphasize that the described method for analytical computation of the correntropy is also a contribution of this work (in the literature, correntropy is analytically analyzed in terms of its Taylor series expansion [Principe, 2010; Yang et al., 2011]). Next, statistical estimation issues like the number of samples, number of delays, and noise disturbances were analyzed, encompassing all considered criteria.…”
Section: Discussion
confidence: 99%