2021
DOI: 10.48550/arxiv.2101.12459
Preprint

On $f$-divergences between Cauchy distributions

Abstract: We prove that the $f$-divergences between univariate Cauchy distributions are always symmetric and can be expressed as functions of the chi-squared divergence. We show that this property does not hold anymore for multivariate Cauchy distributions. We then present several metrizations of $f$-divergences between univariate Cauchy distributions.
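As a concrete illustration of the abstract's claim, the following well-known closed forms for univariate Cauchy densities $p_{l,s}(x) = \frac{s}{\pi((x-l)^2 + s^2)}$ can serve as a sketch for orientation (they are not quoted from this page):

$$\chi^2(p_{l_1,s_1} : p_{l_2,s_2}) = \frac{(l_1 - l_2)^2 + (s_1 - s_2)^2}{2 s_1 s_2}, \qquad \mathrm{KL}(p_{l_1,s_1} : p_{l_2,s_2}) = \log\!\left(1 + \tfrac{1}{2}\,\chi^2(p_{l_1,s_1} : p_{l_2,s_2})\right).$$

Both expressions are invariant under swapping $(l_1, s_1)$ and $(l_2, s_2)$, illustrating the symmetry statement, and the second writes the Kullback-Leibler divergence as a function of the chi-squared divergence.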

Cited by 6 publications (16 citation statements)
References 48 publications

Citation statements
“…In general, a sufficient condition for expressing the f-divergence as a series of order-k chi divergences is given in [21]. However, the condition…”
Section: Some Illustrating Examples
confidence: 99%
“…However, even for asymmetric f-divergences like the Kullback-Leibler divergence, some parametric families of distributions yield symmetric divergences. This is the case for the f-divergences between isotropic Gaussian distributions or f-divergences between Cauchy distributions which are always symmetric [21].…”
Section: Introduction
confidence: 99%
“…In [14], the following closed-form formula (using complex analysis in $\mathbb{C}$) was proven for the Kullback-Leibler divergence between a Cauchy density and a mixture of two Cauchy densities:…”
Section: Information Geometry of the Mixture Family of Two Cauchy Dis...
confidence: 99%
“…In [14], the following closed-form formula was reported for the Jensen-Shannon divergence when θ = 1/2:…”
Section: Information Geometry of the Mixture Family of Two Cauchy Dis...
confidence: 99%