2021
DOI: 10.48550/arxiv.2107.05901
Preprint

Fast approximations of the Jeffreys divergence between univariate Gaussian mixture models via exponential polynomial densities

Frank Nielsen

Abstract: The Jeffreys divergence is a renowned symmetrization of the statistical Kullback-Leibler divergence, widely used in statistics, machine learning, signal processing, and the information sciences in general. Since the Jeffreys divergence between ubiquitous Gaussian Mixture Models is not available in closed form, many techniques with various pros and cons have been proposed in the literature to either (i) estimate, (ii) approximate, or (iii) lower and/or upper bound this divergence. In this work, we propose…
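To make the estimation problem in (i) concrete, the following is a minimal sketch (not the paper's exponential-polynomial method) of a plain Monte Carlo estimator of the Jeffreys divergence J(p, q) = KL(p‖q) + KL(q‖p) between two univariate Gaussian mixtures, using NumPy/SciPy; the mixture parameters below are illustrative placeholders, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def gmm_logpdf(x, weights, means, stds):
    # Log-density of a univariate Gaussian mixture evaluated at points x.
    comp = np.stack([w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds)])
    return np.log(comp.sum(axis=0))

def gmm_sample(n, weights, means, stds, rng):
    # Draw n samples by first choosing a component, then sampling from it.
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[idx], np.asarray(stds)[idx])

def mc_kl(p, q, n=100_000, rng=None):
    # Monte Carlo estimate of KL(p || q) using samples drawn from p.
    rng = rng or np.random.default_rng(0)
    x = gmm_sample(n, *p, rng)
    return np.mean(gmm_logpdf(x, *p) - gmm_logpdf(x, *q))

def mc_jeffreys(p, q, n=100_000, rng=None):
    # Jeffreys divergence is the symmetrized KL: J(p, q) = KL(p||q) + KL(q||p).
    return mc_kl(p, q, n, rng) + mc_kl(q, p, n, rng)

# Two toy univariate GMMs given as (weights, means, standard deviations).
p = ([0.4, 0.6], [-1.0, 2.0], [0.5, 1.0])
q = ([0.7, 0.3], [0.0, 3.0], [1.0, 0.8])
print(mc_jeffreys(p, q))
```

Such stochastic estimators are consistent but non-deterministic and comparatively slow, which is the trade-off that motivates fast deterministic approximations like the one the abstract announces.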
