2012
DOI: 10.3390/e14091606

Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions

Abstract: The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariate normal distribution, showing that this is equivalent to comparing univariate versions of these distributions. Finally, we apply our results to a seismological catalogue data set related to the 2010 Maule earthq…
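The Jeffreys divergence used in the abstract is the symmetrized Kullback–Leibler divergence. For two multivariate normal distributions the KL divergence has a well-known closed form, which the sketch below implements (function names are illustrative, not from the paper):

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL divergence KL(N(mu0, S0) || N(mu1, S1))
    between two k-dimensional multivariate normals."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(S1_inv @ S0)          # tr(S1^{-1} S0)
        + diff @ S1_inv @ diff         # Mahalanobis term
        - k                            # dimension
        + np.log(np.linalg.det(S1) / np.linalg.det(S0))
    )

def jeffreys_mvn(mu0, S0, mu1, S1):
    """Jeffreys (symmetrized KL) divergence between two normals."""
    return kl_mvn(mu0, S0, mu1, S1) + kl_mvn(mu1, S1, mu0, S0)
```

The skew-normal case treated in the paper requires additional terms beyond this normal–normal formula; the snippet only illustrates the baseline quantity being symmetrized.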

Cited by 64 publications (31 citation statements) · References 22 publications
“…Inserting the ML estimation (fixed) parameters represents the simplest evaluation of these bounds [22]. However, between the lower and upper Rényi entropy bounds exists a considerable distance.…”
Section: Methods (mentioning, confidence: 99%)
“…For the case m = 1, Contreras-Reyes and Arellano-Valle [22] consider the upper bound of the property (i) of Proposition 2 to approximate the Shannon entropy of an SN distribution using the property (ii) of Proposition 2. In this Proposition 2(ii), the left side includes an integral related to a product of two skew-normal densities.…”
Section: Proposition (mentioning, confidence: 99%)
“…Contreras-Reyes and Arellano-Valle [6] considered the result of Kupperman [30] to develop an asymptotic test of complete homogeneity in terms of the J divergence between two SN distributions. The SN distribution satisfies all the aforementioned regularity conditions when skewness parameter η = 0.…”
Section: Two-sample Case (mentioning, confidence: 99%)
“…In both cases, we compare the SE and negentropies obtained from their series expansions with their corresponding "exact" versions computed from the Quadpack numerical integration method of Piessens et al [31]. More precisely, the "exact" expected values E{ζ 0 (τZ τ )} and E{ζ 0 (τZ * τ )} are computed using the Quadpack method as in Arellano-Valle et al [16], Contreras-Reyes and Arellano-Valle [6] or Contreras-Reyes [18]. From the series expansions, the SE and negentropies were carried out for k = 12 as in Withers and Nadarajah [19].…”
Section: Simulations (mentioning, confidence: 99%)
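The last excerpt compares series-expansion entropies against "exact" values computed with the Quadpack numerical integration method. SciPy's `quad` wraps the same Fortran QUADPACK routines, so a minimal one-dimensional sketch for the standard skew-normal (an illustration, not the authors' exact computation) looks like:

```python
import numpy as np
from scipy.integrate import quad   # quad wraps Fortran QUADPACK
from scipy.stats import skewnorm

def sn_shannon_entropy(alpha: float) -> float:
    """Shannon entropy of the standard skew-normal SN(0, 1, alpha),
    computed by numerical integration of -f(x) log f(x)."""
    def integrand(x):
        p = skewnorm.pdf(x, alpha)
        # use the 0 * log(0) -> 0 convention in the tails
        return -p * np.log(p) if p > 0 else 0.0
    value, _abserr = quad(integrand, -np.inf, np.inf)
    return value
```

At `alpha = 0` the skew-normal reduces to the standard normal, so the entropy should equal 0.5 * ln(2πe) ≈ 1.4189, and it decreases as the skewness parameter grows in magnitude.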