2008
DOI: 10.3390/entropy-e10030200
Calculation of Differential Entropy for a Mixed Gaussian Distribution

Abstract: In this work, an analytical expression is developed for the differential entropy of a mixed Gaussian distribution. One of the terms is given by a tabulated function of the ratio of the distribution parameters.
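To make the quantity concrete, the sketch below numerically evaluates the differential entropy of a mixed Gaussian, assumed here to be the equal-weight mixture of N(−μ, σ²) and N(+μ, σ²) with a common variance; the function names and the quadrature approach are illustrative assumptions, not the paper's tabulated solution. The computed entropy depends only on the ratio μ/σ, consistent with the abstract's tabulated-function term.

```python
# Minimal sketch (not the paper's closed-form result): numerical differential
# entropy of an assumed equal-weight mixed Gaussian 0.5*N(-mu, s^2) + 0.5*N(+mu, s^2).
import numpy as np
from scipy.integrate import quad

def mixed_gaussian_pdf(x, mu, sigma):
    """Equal-weight mixture of two Gaussians centered at -mu and +mu, common sigma."""
    c = 1.0 / (sigma * np.sqrt(2.0 * np.pi))
    return 0.5 * c * (np.exp(-(x + mu) ** 2 / (2.0 * sigma ** 2))
                      + np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)))

def differential_entropy(mu, sigma):
    """h = -int p(x) ln p(x) dx, evaluated by adaptive quadrature (nats)."""
    def integrand(x):
        p = mixed_gaussian_pdf(x, mu, sigma)
        return -p * np.log(p) if p > 0.0 else 0.0  # guard against exp underflow
    h, _ = quad(integrand, -np.inf, np.inf)
    return h

# The entropy depends on mu/sigma only: mu = 0 recovers the single-Gaussian value
# 0.5 * ln(2*pi*e*sigma^2), and a well-separated mixture adds roughly ln(2) nats.
for ratio in (0.0, 0.5, 1.0, 3.0):
    print(f"mu/sigma = {ratio}: h = {differential_entropy(ratio, 1.0):.4f} nats")
```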

Cited by 57 publications (52 citation statements)
References 11 publications
“…To complement our analysis above using L, we now look at how the (Gibbs) differential entropy S(t) = −∫ dx p(x,t) ln p(x,t) (e.g., see [50], and using units where the Boltzmann constant k_B = 1) changes during the phase transition. It is important to note that S differs from L in that it only depends on p at any instant in time, but not on the evolution that led to that PDF.…”
Section: Entropy
confidence: 99%
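As a minimal illustration of the instantaneous (Gibbs) differential entropy S(t) = −∫ dx p(x,t) ln p(x,t) used in the excerpt above, the sketch below evaluates S(t) on a grid for an assumed spreading Gaussian PDF with k_B = 1; the choice of PDF, the grid, and the trapezoid rule are illustrative assumptions, not taken from the cited study.

```python
# Illustrative only: S(t) evaluated on a grid for an assumed time-dependent PDF.
import numpy as np
from scipy.integrate import trapezoid

def gibbs_entropy(p, x):
    """S = -int p ln p dx via the trapezoid rule; depends only on p at this instant."""
    integrand = np.zeros_like(p)
    mask = p > 0.0                      # skip points where p underflowed to zero
    integrand[mask] = -p[mask] * np.log(p[mask])
    return trapezoid(integrand, x)

x = np.linspace(-50.0, 50.0, 20001)
for t in (0.1, 1.0, 10.0):
    sigma_t = np.sqrt(2.0 * t)          # assumed diffusive spreading, sigma^2 = 2t
    p = np.exp(-x**2 / (2.0 * sigma_t**2)) / (sigma_t * np.sqrt(2.0 * np.pi))
    exact = 0.5 * np.log(2.0 * np.pi * np.e * sigma_t**2)
    print(f"t = {t}: S(t) = {gibbs_entropy(p, x):.4f} nats (closed form {exact:.4f})")
```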
“…It thus generalizes the bounds reported in [7] to GMMs with arbitrary variances that are not necessarily equal.…”
Section: Without Loss of Generality, Consider GMMs in the Form
confidence: 75%
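For context on bounds of this kind, the sketch below computes the standard conditioning-based lower and upper bounds on the differential entropy of a univariate GMM with arbitrary weights and variances; this generic construction is shown only for illustration and is not claimed to be the specific bound of the cited work or of [7].

```python
# Generic mixture-entropy bounds (assumption: these are the standard conditioning
# bounds, not necessarily the exact bounds derived in the cited papers).
import numpy as np

def gmm_entropy_bounds(weights, sigmas):
    """Bounds on h(X) for X ~ sum_i w_i N(mu_i, sigma_i^2), valid for any means.

    Lower:  sum_i w_i h_i          (entropy conditioned on the component index)
    Upper:  sum_i w_i h_i + H(w)   (joint entropy of (X, index))
    where h_i = 0.5 * ln(2*pi*e*sigma_i^2) and H(w) is the discrete mixing entropy.
    """
    w = np.asarray(weights, dtype=float)
    s = np.asarray(sigmas, dtype=float)
    comp = np.sum(w * 0.5 * np.log(2.0 * np.pi * np.e * s ** 2))
    mix = -np.sum(w * np.log(w))
    return comp, comp + mix

lo, hi = gmm_entropy_bounds(weights=[0.3, 0.7], sigmas=[1.0, 2.5])
print(f"{lo:.4f} nats <= h(X) <= {hi:.4f} nats")
```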
“…Alternatively, we can also use the base-2 logarithm (log_2 x = log x / log 2) and get the entropy expressed in bit units. Although the KL divergence is available in closed-form for many distributions (in particular as equivalent Bregman divergences for exponential families [5], see Appendix C), it was proven that the Kullback-Leibler divergence between two (univariate) GMMs is not analytic [6] (see also the particular case of a GMM of two components with the same variance that was analyzed in [7]). See Appendix A for an analysis.…”
Section: Introduction
confidence: 99%
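Because no closed form exists for the KL divergence between GMMs, it is typically estimated numerically; the sketch below uses a plain Monte Carlo estimator with samples drawn from p (a generic approach assumed here, not the method of the cited works) and converts nats to bits via log_2 x = log x / log 2. The two-component, equal-variance p mirrors the mixed-Gaussian case mentioned for [7]; all parameter values are arbitrary examples.

```python
# Monte Carlo estimate of KL(p || q) for two univariate GMMs (no closed form exists).
import numpy as np

rng = np.random.default_rng(0)

def gmm_logpdf(x, weights, means, sigmas):
    """log p(x) for a univariate GMM, evaluated with a log-sum-exp for stability."""
    x = np.atleast_1d(x)[:, None]
    log_comp = (np.log(weights)
                - 0.5 * np.log(2.0 * np.pi * np.asarray(sigmas) ** 2)
                - (x - np.asarray(means)) ** 2 / (2.0 * np.asarray(sigmas) ** 2))
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

def gmm_sample(n, weights, means, sigmas):
    """Draw n samples by picking a component, then sampling that Gaussian."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[idx], np.asarray(sigmas)[idx])

p = dict(weights=[0.5, 0.5], means=[-1.0, 1.0], sigmas=[1.0, 1.0])  # mixed-Gaussian-like case
q = dict(weights=[0.3, 0.7], means=[-2.0, 0.5], sigmas=[0.8, 1.5])

x = gmm_sample(200_000, **p)
kl_nats = np.mean(gmm_logpdf(x, **p) - gmm_logpdf(x, **q))
print(f"KL(p||q) ~ {kl_nats:.4f} nats = {kl_nats / np.log(2.0):.4f} bits")
```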
“…In a variety of systems, adaptation takes place by inflating or deflating information, so that the "right" balance is achieved. Certainly, given that it is possible to derive upper and lower bounds for the differential entropy of a PDF (e.g., [20]), it should also be possible to define analytical bounds for the complexity measures for the given PDF. However, for practical purposes, complexity measures are constrained by the selected range of the PDF parameters.…”
Section: Discussion
confidence: 99%