2012
DOI: 10.1111/j.1467-9469.2011.00774.x

Shannon Entropy and Mutual Information for Multivariate Skew‐Elliptical Distributions

Abstract: The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We imple…

Cited by 83 publications (84 citation statements)
References 24 publications
“…The SE of a location-scale random variable X = µ + σZ does not depend on µ and is such that H(X) = log σ + H(Z) (see, e.g., [16]). The SE can serve to define a measure of disparity from normality, the so-called negentropy [17], which is zero for a Gaussian variable and positive for any non-Gaussian distribution.…”
Section: Shannon Entropy and Related Measures
confidence: 99%
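The location-scale property quoted above, H(X) = log σ + H(Z), can be checked numerically. A minimal sketch, assuming a Gaussian base variable Z and using SciPy's built-in differential entropy (in nats); the parameter values are illustrative:

```python
# Numerical check of the location-scale property H(X) = log(sigma) + H(Z):
# for X = mu + sigma*Z, differential entropy is independent of mu and
# shifts by log(sigma). Illustrated here with a Gaussian Z.
import numpy as np
from scipy.stats import norm

mu, sigma = 3.0, 2.5
h_z = norm(loc=0.0, scale=1.0).entropy()      # H(Z) for standard normal
h_x = norm(loc=mu, scale=sigma).entropy()     # H(X) for X = mu + sigma*Z

assert np.isclose(h_x, np.log(sigma) + h_z)   # H(X) = log(sigma) + H(Z)
# Changing mu leaves the entropy unchanged:
assert np.isclose(h_x, norm(loc=-7.0, scale=sigma).entropy())
```

The same identity holds for any base density Z, which is why entropy tables for standardized distributions suffice for the whole location-scale family.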
“…Given that the calculation of negentropy presents a computational challenge, since the integral involves the pdf of Z [16,18], different approximations of negentropy are used, such as cumulant expansion series [17,19]. Withers and Nadarajah [19] provided exact and explicit series expansions for the SE and negentropy of a standardized pdf f on R in terms of cumulants.…”
Section: Shannon Entropy and Related Measures
confidence: 99%
“…In this section, we focus our attention principally on the skew-t model with ν (ν > 0) degrees of freedom (Branco and Dey, 2001; Azzalini and Capitanio, 2003; Arellano-Valle et al., 2012). This model follows by assuming in (3) that the mixing random factors v_t are iid Gamma(ν/2, ν/2), i.e., with density given by…”
Section: The Skew-t Special Case
confidence: 99%
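The Gamma(ν/2, ν/2) scale-mixture construction quoted above can be sampled directly. A minimal sketch, assuming the standard univariate skew-normal construction Z = δ|U₀| + √(1−δ²)U₁ with δ = α/√(1+α²); the parameter values and variable names are illustrative, not the paper's notation:

```python
# Scale-mixture representation of the skew-t model: with mixing factor
# V ~ Gamma(nu/2, rate nu/2) and Z skew-normal(alpha), X = Z / sqrt(V)
# follows a skew-t distribution with nu degrees of freedom.
import numpy as np

rng = np.random.default_rng(0)
nu, alpha, n = 5.0, 3.0, 100_000
delta = alpha / np.sqrt(1 + alpha**2)

u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
z = delta * np.abs(u0) + np.sqrt(1 - delta**2) * u1  # skew-normal(alpha)
v = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)    # Gamma(nu/2, rate nu/2)
x = z / np.sqrt(v)                                   # skew-t(alpha, nu)

assert x.mean() > 0  # alpha > 0 skews the mass to the right
```

Note that NumPy's `gamma` is parameterized by scale, so a rate of ν/2 corresponds to `scale=2/nu`; taking α = 0 recovers the ordinary Student-t mixture.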
“…The entropy quantifies such uncertainty in bits, which matches the scale of the exposure value. According to [26], based on Shannon's definition of entropy:…”
Section: B. Designing Exposure Certainty Function
confidence: 99%
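The "uncertainty in bits" mentioned in this statement is the discrete Shannon entropy with a base-2 logarithm. A minimal sketch of that definition; the function name is hypothetical:

```python
# Shannon entropy of a discrete distribution, in bits (base-2 logarithm).
import numpy as np

def shannon_entropy_bits(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # 0*log(0) = 0 by convention
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy_bits([0.5, 0.5]))    # 1.0 bit: a fair coin
print(shannon_entropy_bits([0.25] * 4))    # 2.0 bits: four equal outcomes
```

Base 2 gives units of bits; replacing `log2` with the natural logarithm gives nats, the unit used for the differential entropies discussed in the other citation statements.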