2016
DOI: 10.1002/cjs.11285
Shannon entropy and Kullback–Leibler divergence in multivariate log fundamental skew‐normal and related distributions

Abstract: This paper mainly focuses on studying the Shannon entropy and Kullback–Leibler divergence of the multivariate log canonical fundamental skew-normal (LCFUSN) and canonical fundamental skew-normal (CFUSN) families of distributions, extending previous works. We relate our results to the entropies of other well-known distributions. As a byproduct, we also obtain the mutual information for distributions in these families. Shannon entropy is used to compare models fitted to analyze the USA monthly precipitation data. Kull…
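The quantities named in the abstract (differential Shannon entropy and Kullback–Leibler divergence) can be estimated by plain Monte Carlo. A minimal sketch, using scipy's ordinary (Azzalini) univariate skew-normal as a stand-in for the multivariate CFUSN/LCFUSN families studied in the paper, which it is not:

```python
# Monte Carlo sketch of Shannon entropy and KL divergence for a skew-normal law.
# Illustration only: the paper treats the multivariate CFUSN/LCFUSN families;
# scipy's `skewnorm` is the ordinary univariate skew-normal, used as a stand-in.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000

p = stats.skewnorm(a=4.0, loc=0.0, scale=1.0)  # "true" model: shape parameter a = 4
q = stats.norm(loc=0.0, scale=1.0)             # reference model: standard normal

x = p.rvs(size=n, random_state=rng)

# Differential Shannon entropy of p:  H(p) = -E_p[log p(X)]
entropy_mc = -np.mean(p.logpdf(x))

# Kullback–Leibler divergence:  KL(p || q) = E_p[log p(X) - log q(X)]
kl_mc = np.mean(p.logpdf(x) - q.logpdf(x))

print(f"MC entropy of skew-normal(a=4): {entropy_mc:.4f} nats")
print(f"Entropy via scipy quadrature:   {p.entropy():.4f} nats")
print(f"MC KL(skew-normal || normal):   {kl_mc:.4f} nats")
```

The Monte Carlo estimates converge at the usual n^{-1/2} rate; for the paper's families one would substitute the closed-form entropy and divergence expressions derived there.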

Cited by 4 publications (5 citation statements)
References 25 publications
“…Moreover, several recent investigations confirmed the usefulness of entropic quantifiers in the study of asymmetric distributions [3,7,8] and their applications to topics such as thermal wake [9], marine fish biology [3,8], sea surface temperature (SST), relative humidity measured in the Atlantic Ocean [10], and more. We build on the study of [3], which developed hypothesis testing for normality, i.e., if the shape parameter is close to zero.…”
Section: Introduction
confidence: 88%
“…From (10), given that H(X_{±ε}) depends only on the shape parameter η, we obtain H(X_{±ε}) = H(X), and H(Y) depends only on the parameters η and ε. Therefore,…”
Section: Shannon Entropy
confidence: 95%
“…Thanks to its characteristics, Shannon entropy is a popular method, not only for fault detection but also for other applications, such as for the analysis of biological signals [12], computational applications [13], and environmental data [14].…”
Section: Shannon Entropy
confidence: 99%
“…, M, are assumed to equal their mean value, (ϕ_i) = 0, µ_i^s(t) ≡ µ_s(t), 1 ≤ s ≤ 4; moreover, taking into account stochastic representations of log-skew elliptical random vectors [39], the expressions for the univariate and multivariate Shannon entropies (measured in nats) take the forms given in [40]. As we can see from Equation 45, the mutual information, I, is calculated directly by summing the individual entropies and subtracting the joint entropy. Mutual information, I, between two random variables, X_s and X_u, compares the uncertainty of measuring the variables jointly with the uncertainty of measuring the two variables independently, identifies nonlinear dependence between two variables [41–43], and is non-negative and symmetrical.…”
Section: Information Measures
confidence: 99%
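The identity quoted above, I(X_s, X_u) = H(X_s) + H(X_u) − H(X_s, X_u), can be checked numerically. A minimal sketch, assuming a bivariate Gaussian rather than the cited paper's log-skew elliptical vectors, so that the closed form −½ log(1 − ρ²) is available for comparison:

```python
# Mutual information from the entropy identity I = H(X_s) + H(X_u) - H(X_s, X_u),
# illustrated with a bivariate Gaussian so the result can be checked in closed form.
import numpy as np
from scipy import stats

rho = 0.7
cov = np.array([[1.0, rho],
                [rho, 1.0]])

h_xs = stats.norm(scale=np.sqrt(cov[0, 0])).entropy()                # marginal H(X_s)
h_xu = stats.norm(scale=np.sqrt(cov[1, 1])).entropy()                # marginal H(X_u)
h_joint = stats.multivariate_normal(mean=[0, 0], cov=cov).entropy()  # joint H(X_s, X_u)

mi = h_xs + h_xu - h_joint
print(f"I from entropies:     {mi:.4f} nats")
print(f"Closed-form Gaussian: {-0.5 * np.log(1 - rho**2):.4f} nats")
```

Both printed values should agree to the displayed precision; for the skewed families in the paper the marginal and joint entropies would instead come from the closed-form expressions derived there.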