2018
DOI: 10.48550/arxiv.1805.11257
Preprint

The Differential Entropy of Mixtures: New Bounds and Applications

Cited by 2 publications (5 citation statements)
References: 0 publications
“…The following appears as Theorem III.1 in the preprint [14]. It states that skewing an f-divergence preserves its status as such.…”
Section: Skew Divergences (mentioning; confidence: 94%)
“…A proof is given in the appendix for the convenience of the reader. Theorem 3.1 (Melbourne et al. [14]). For $t, s \in [0, 1]$ and an $f$-divergence $D_f(\cdot\,\|\,\cdot)$, in the sense that…”
Section: Skew Divergences (mentioning; confidence: 99%)
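
As context for the quoted (truncated) statement, a minimal LaTeX sketch of the skewing operation it refers to; the shorthand $D_f^{(t,s)}$ is assumed here for illustration and is not taken from the preprint:

% Skewing an f-divergence between densities p and q (shorthand notation assumed)
\[
  D_f^{(t,s)}(p \,\|\, q) \;:=\; D_f\bigl((1-t)\,p + t\,q \,\big\|\, (1-s)\,p + s\,q\bigr),
  \qquad t, s \in [0,1].
\]
% The quoted theorem is read as asserting that, for fixed t and s, this skewed
% quantity is again an f-divergence, i.e. it equals D_g(p || q) for some convex
% g with g(1) = 0.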
“…In [19], the authors used the principle of entropy concavity deficit to introduce bounds on the differential entropy of a Gaussian mixture. This principle uses the difference between the differential entropy of the mixture and the weighted sum of differential entropies of its constituent components.…”
Section: Introduction (mentioning; confidence: 99%)
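
To illustrate the quoted principle, a minimal Python sketch (not code from [19]) that estimates the concavity deficit h(mixture) − Σᵢ wᵢ h(fᵢ) for a two-component Gaussian mixture via Monte Carlo and compares it with the elementary upper bound H(w), the Shannon entropy of the mixing weights; all parameters below are arbitrary illustrative choices:

# Concavity deficit of a two-component Gaussian mixture (illustrative sketch).
# deficit = h(mixture) - sum_i w_i * h(component_i); it is >= 0 by concavity of
# differential entropy and <= H(w) by conditioning on the latent component index.
import numpy as np

rng = np.random.default_rng(0)

w = np.array([0.3, 0.7])          # mixing weights (illustrative)
mu = np.array([0.0, 3.0])         # component means (illustrative)
sigma = np.array([1.0, 2.0])      # component standard deviations (illustrative)

def gauss_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Differential entropy of each Gaussian component in nats: 0.5 * log(2*pi*e*sigma^2)
h_components = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

# Monte Carlo estimate of the mixture entropy h(f) = -E_f[log f(X)]
n = 200_000
comp = rng.choice(2, size=n, p=w)
x = rng.normal(mu[comp], sigma[comp])
mix_density = w[0] * gauss_pdf(x, mu[0], sigma[0]) + w[1] * gauss_pdf(x, mu[1], sigma[1])
h_mixture = -np.mean(np.log(mix_density))

deficit = h_mixture - np.dot(w, h_components)
H_w = -np.sum(w * np.log(w))      # Shannon entropy of the weights (nats)

print(f"h(mixture)        ~= {h_mixture:.4f} nats")
print(f"sum_i w_i h(f_i)   = {np.dot(w, h_components):.4f} nats")
print(f"concavity deficit ~= {deficit:.4f}, elementary upper bound H(w) = {H_w:.4f}")

The elementary bound follows from h(X) ≤ h(X | I) + H(I), where I is the component index; the quoted statement indicates that [19] derives sharper bounds of this type for Gaussian mixtures.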