2017
DOI: 10.3390/e19070361

Estimating Mixture Entropy with Pairwise Distances

Abstract: Mixture distributions arise in many parametric and non-parametric settings, for example in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between mixture components, and show that this estimator class has many attractive properties. For many distributions of interest…
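As a concrete illustration of the estimator family described in the abstract, the sketch below implements what I understand to be its general form, H_hat_D = sum_i w_i H(p_i) - sum_i w_i ln( sum_j w_j exp(-D(p_i || p_j)) ), specialized to a Gaussian mixture with a shared spherical covariance sigma^2 * I. This is not the authors' reference code; the spherical-Gaussian specialization, the function name, and the choice of the KL divergence and the Bhattacharyya distance as the two pairwise distances (which the citing works below describe as giving an upper and a lower bound, respectively) are assumptions made for illustration.

```python
# Minimal sketch of a pairwise-distance mixture-entropy estimate for a
# Gaussian mixture sum_i w_i N(mu_i, sigma2 * I). Not the authors' code.
import numpy as np

def pairwise_entropy_estimate(means, weights, sigma2, distance="kl"):
    """distance="kl"   -> pairwise KL divergence   (upper-bound variant)
       distance="bhat" -> pairwise Bhattacharyya   (lower-bound variant)"""
    means = np.asarray(means, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n, d = means.shape

    # Entropy of each component (all share the same spherical covariance).
    h_component = 0.5 * d * np.log(2.0 * np.pi * np.e * sigma2)

    # Pairwise squared distances between component means, shape (n, n).
    diff = means[:, None, :] - means[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)

    if distance == "kl":       # KL(p_i || p_j) for equal spherical covariances
        D = sq / (2.0 * sigma2)
    elif distance == "bhat":   # Bhattacharyya distance for equal covariances
        D = sq / (8.0 * sigma2)
    else:
        raise ValueError(distance)

    inner = np.log(np.exp(-D) @ weights)       # ln sum_j w_j exp(-D_ij)
    return h_component - np.dot(weights, inner)

# Toy usage: two unit-variance 1-D components with means 0 and 3.
means = np.array([[0.0], [3.0]])
w = np.array([0.5, 0.5])
print(pairwise_entropy_estimate(means, w, 1.0, "bhat"),   # lower-bound variant
      pairwise_entropy_estimate(means, w, 1.0, "kl"))     # upper-bound variant
```

The two variants agree at the extremes (identical components, or infinitely separated components) and bracket the intermediate cases, which is what makes the family useful as a cheap surrogate for the intractable exact entropy.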

Cited by 97 publications (112 citation statements)
References 32 publications
Citing publications: 2017–2024
“…Moreover, one should note that although KNN-based estimators or KDE are more data-efficient, the adaptive bin size will jeopardize the shape of the PDF, thus making the estimator deviate from the true mutual information values. In fact, the base estimator used in [36] just provides KDE-based lower and upper bounds on the true mutual information [95]. Interestingly, a recent paper by Noshad et al [93] suggested using dependence graphs to estimate true mutual information values and observed the compression phase even using ReLU activation functions.…”
Section: A Deeper Insight On The Role Of Information Theoretic Est… (mentioning)
confidence: 99%
“…The mutual information involves the entropy of a mixture distribution, which has no tractable analytical form. Thus, pairwise distances are adopted to provide lower and upper bounds on the mutual information [41]. The results are shown in the following proposition for completeness.…”
Section: Duration (mentioning)
confidence: 99%
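The statement above applies the pairwise-distance bounds to a mutual information whose intractable part is a mixture entropy. A common construction of this kind, assumed here purely for illustration rather than taken from the citing paper, is T = f(x) + Gaussian noise of variance sigma^2 with x drawn uniformly from n samples, so that the marginal of T is an equal-weight Gaussian mixture and I(X;T) = H(T) - H(T|X). In that case the per-component entropies cancel against H(T|X) and only the pairwise term of the estimator survives.

```python
# Sketch of lower/upper bounds on I(X;T) under the assumptions stated above:
# T = f(x) + N(0, sigma2*I), x uniform over n samples, so
#   I(X;T) = H(T) - H(T|X),  with H(T) a mixture entropy.
import numpy as np

def mutual_information_bounds(codes, sigma2):
    """codes: array (n, d) of noiseless representations f(x_i)."""
    diff = codes[:, None, :] - codes[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)                 # ||f(x_i) - f(x_j)||^2
    # Component entropies cancel against H(T|X), leaving only the pairwise term:
    #   I_hat = -(1/n) sum_i ln( (1/n) sum_j exp(-D_ij) )
    def pairwise_term(D):
        return -np.mean(np.log(np.mean(np.exp(-D), axis=1)))
    lower = pairwise_term(sq / (8.0 * sigma2))      # Bhattacharyya distances
    upper = pairwise_term(sq / (2.0 * sigma2))      # KL divergences
    return lower, upper

# Toy usage on random 2-D codes with noise variance 0.1.
codes = np.random.default_rng(0).normal(size=(200, 2))
print(mutual_information_bounds(codes, sigma2=0.1))
```

Note that the diagonal terms (D_ii = 0) keep the lower bound below ln n, consistent with I(X;T) never exceeding the entropy of n equiprobable inputs.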
“…Maximization of information or, equivalently, maximization of the output entropy has been proposed by many authors (see, e.g., [13,15,17,18,21–23]), but the mutual information is very hard to compute and the problem is often intractable. To overcome this serious difficulty, instead of the mutual information, the lower bound given by Kolchinsky & Tracey [24] has been used. This is a pairwise-distance based entropy estimator and it is useful here, since it is differentiable, tight, and asymptotically reaches the maximum possible information (see [24]…”
Section: Introduction (mentioning)
confidence: 99%
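To illustrate why differentiability matters in the usage described above, the sketch below treats the Bhattacharyya-variant lower bound, again specialized to equal-weight spherical Gaussian components, as an objective and runs a few steps of gradient ascent on the component means. The closed-form gradient is my own derivation for this special case, a stand-in for what an autodiff framework would compute in practice; none of this is code from the cited works.

```python
# Sketch: the Bhattacharyya-variant lower bound as a differentiable objective.
import numpy as np

def lower_bound_and_grad(means, weights, sigma2):
    """Entropy lower bound for an equal-covariance Gaussian mixture and its
    gradient with respect to the component means (hand-derived, illustrative)."""
    n, d = means.shape
    h0 = 0.5 * d * np.log(2.0 * np.pi * np.e * sigma2)
    diff = means[:, None, :] - means[None, :, :]          # (n, n, d)
    D = np.sum(diff ** 2, axis=-1) / (8.0 * sigma2)       # pairwise Bhattacharyya
    A = np.exp(-D) * weights[None, :]                     # A_ij = w_j exp(-D_ij)
    s = A.sum(axis=1)                                     # s_i = sum_j A_ij
    value = h0 - np.dot(weights, np.log(s))
    B = (weights / s)[:, None] * A                        # B_ij = (w_i / s_i) A_ij
    C = B + B.T
    grad = (C.sum(axis=1)[:, None] * means - C @ means) / (4.0 * sigma2)
    return value, grad

# Gradient ascent on the means: the bound increases as components spread apart.
rng = np.random.default_rng(0)
means = rng.normal(scale=0.1, size=(8, 2))
w = np.full(8, 1.0 / 8)
before, _ = lower_bound_and_grad(means, w, 1.0)
for _ in range(200):
    _, grad = lower_bound_and_grad(means, w, 1.0)
    means += 0.5 * grad
after, _ = lower_bound_and_grad(means, w, 1.0)
print(before, after)
```

Because the bound is a smooth function of the component parameters, the same construction can sit inside a larger model and be optimized end to end, which is the property the quoted passage relies on.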