Entropy, 2018. DOI: 10.3390/e20050383
On f-Divergences: Integral Representations, Local Behavior, and Inequalities

Abstract: This paper focuses on f-divergences and comprises three main contributions. The first introduces integral representations of a general f-divergence by means of the relative information spectrum. The second provides a new approach to the derivation of f-divergence inequalities and exemplifies its utility in the setting of Bayesian binary hypothesis testing. The third further studies the local behavior of f-divergences.
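For orientation, the local behavior studied in the third part relates to a classical fact about f-divergences (a standard result from the literature, stated here as background and not as a quotation from the paper): if f is twice differentiable at 1 with f(1) = 0, then as P approaches Q,

    D_f(P \| Q) \;\approx\; \frac{f''(1)}{2} \, \chi^2(P \| Q),

so every sufficiently smooth f-divergence behaves locally like a scaled chi-squared divergence.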

Cited by 44 publications (39 citation statements); references 55 publications.

Selected citation statements:
“…where u_α : (0, ∞) → ℝ is a non-negative and convex function with u_α(1) = 0, which is defined for t > 0 as follows (see [38, Chapter 2], followed by studies in, e.g., [5], [16], [39], [49] and [62]):…”
Section: Illustration of Theorem 7 and Further Results
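The quoted sentence truncates the actual definition of u_α. For orientation only, a common α-divergence generator in the literature with exactly these properties (non-negative, convex, u_α(1) = 0) is the following; this is an assumption about the intended family, not the citing paper's verbatim definition:

    u_\alpha(t) \;=\; \frac{t^\alpha - \alpha(t - 1) - 1}{\alpha(\alpha - 1)}, \qquad t > 0, \; \alpha \in \mathbb{R} \setminus \{0, 1\}.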
“…Letting α* := max{α₁, α₂} gives the result in (66) for all α > α*. Item f) of Theorem 5 is a direct consequence of [62, Lemma 4], which relies on [50, Theorem 3]. Let g(t) := (t − 1)² for t ≥ 0 (hence, D_g(·‖·) is the χ² divergence).…”
Section: Appendix D: Proof of Theorem
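The identification of D_g with the χ² divergence for g(t) = (t − 1)² is easy to verify numerically. A minimal sketch on a three-point alphabet (the distributions p and q below are arbitrary illustrative values):

    import numpy as np

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.4, 0.4, 0.2])

    # D_g(P||Q) = sum_x q(x) * g(p(x)/q(x)) with g(t) = (t - 1)^2
    d_g = np.sum(q * (p / q - 1.0) ** 2)

    # chi^2(P||Q) = sum_x (p(x) - q(x))^2 / q(x)
    chi2 = np.sum((p - q) ** 2 / q)

    print(d_g, chi2)  # both equal 0.175

The agreement is exact, since q(x) · (p(x)/q(x) − 1)² = (p(x) − q(x))²/q(x) term by term.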
“…Let P and Q be distributions defined on a common probability space that have densities p and q with respect to a dominating measure μ. The relative entropy (or Kullback–Leibler divergence) is defined according to D(P‖Q) = ∫ p log(p/q) dμ, and the chi-squared divergence is defined as χ²(P‖Q) = ∫ (p − q)²/q dμ. Both of these divergences can be seen as special cases of the general class of f-divergence measures, and there exists a rich literature on comparisons between different divergences [8, 26, 27, 28, 29, 30, 31, 32]. The chi-squared divergence can also be viewed as the squared weighted L² distance between p and q (with weight 1/q).…”
Section: Bounds on Mutual Information
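Both definitions are straightforward to evaluate in the discrete case, where the dominating measure is the counting measure. A minimal sketch (illustrative values only):

    import numpy as np

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.4, 0.4, 0.2])

    kl = np.sum(p * np.log(p / q))    # D(P||Q) = ∫ p log(p/q) dμ, in nats
    chi2 = np.sum((p - q) ** 2 / q)   # χ²(P||Q) = ∫ (p − q)²/q dμ

    print(kl, chi2)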
“…The α-Jensen–Shannon divergences are Csiszár f-divergences [22, 23, 24]. An f-divergence D_f(P‖Q) is defined for a convex function f : (0, ∞) → ℝ, strictly convex at 1 and satisfying f(1) = 0, as D_f(P‖Q) = ∫ q f(p/q) dμ.…”
Section: Introduction
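Both divergences in the preceding statements fall out of one generic routine. A minimal sketch of the Csiszár construction under the convention D_f(P‖Q) = Σ q f(p/q) used above (function and distribution names are illustrative):

    import numpy as np

    def f_divergence(p, q, f):
        """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)); assumes q > 0 on the support."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(q * f(p / q)))

    p = [0.2, 0.5, 0.3]
    q = [0.4, 0.4, 0.2]

    print(f_divergence(p, q, lambda t: t * np.log(t)))  # f(t) = t log t   -> KL divergence
    print(f_divergence(p, q, lambda t: (t - 1) ** 2))   # f(t) = (t - 1)^2 -> chi^2 divergence

Swapping in a different convex f with f(1) = 0 yields the other members of the family, such as total variation via f(t) = |t − 1|/2.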