2015
DOI: 10.3390/info6030361

A Class of New Metrics Based on Triangular Discrimination

Abstract: Information-theoretic divergences are widely used in information theory, statistics, and other application areas. To satisfy the metric axioms, we introduce a class of new metrics based on the triangular discrimination; these metrics are bounded. Moreover, we obtain sharp inequalities relating the triangular discrimination to other information-theoretic divergences, and we discuss their asymptotic approximation properties.
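For context (the abstract does not restate the definition): the standard triangular discrimination, and the kind of metric construction the title alludes to, are sketched below; the paper's exact class of metrics is not reproduced here.

```latex
% Standard triangular discrimination between discrete probability
% distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n):
\Delta(P, Q) \;=\; \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{p_i + q_i},
\qquad 0 \;\le\; \Delta(P, Q) \;\le\; 2.
% \Delta itself is not a metric in general, but suitable powers of it
% (\sqrt{\Delta} is the commonly cited example) satisfy the triangle
% inequality; constructions of this kind are what the title refers to.
```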

Cited by 6 publications (3 citation statements) | References 11 publications
“…A list of some notable existing f-divergence inequalities is provided, e.g., in [22] Section 1 and [23] Section 3. State-of-the-art techniques which serve to derive bounds among f-divergences include: Moment inequalities which rely on log-convexity arguments ([22] Section 5.D, [24,25,26,27,28]); Inequalities which rely on a characterization of the exact locus of the joint range of f-divergences [29]; f-divergence inequalities via functional domination ([22] Section 3, [30,31,32]); Sharp f-divergence inequalities by using numerical tools for maximizing or minimizing an f-divergence subject to a finite number of constraints on other f-divergences [33]; Inequalities which rely on powers of f-divergences defining a distance [34,35,36,37]; Vajda and Pinsker-type inequalities for f-divergences ([4,10,13,22] Sections 6–7, [38,39]); Bounds among f-divergences when the relative information is bounded ([22] Sections 4–5, [40,41,42,43,44,45,46,47]), and reverse Pinsker inequalities ([22] Section 6, [40,48]); Inequalities which rely on the minimum of an …”
Section: Introduction (citation type: mentioning)
confidence: 99%
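The excerpt above surveys techniques for bounding one f-divergence by another without restating the definition. For orientation, a standard discrete form, with the generators most relevant to this paper, is sketched below; the notation is a common convention, not taken verbatim from the cited references.

```latex
% Discrete f-divergence for a convex generator f with f(1) = 0:
D_f(P \,\|\, Q) \;=\; \sum_{i} q_i \, f\!\left(\frac{p_i}{q_i}\right).
% Examples:
%   triangular discrimination:   f(u) = (u - 1)^2 / (u + 1)
%   Kullback-Leibler divergence: f(u) = u \ln u
%   squared Hellinger distance:  f(u) = (1/2) (\sqrt{u} - 1)^2
```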
“…The squared Hellinger distance can be related to other measures of divergence. For instance, it is of the same order as the Jensen-Shannon divergence and the triangular discrimination [15], [16]. Thus, the bound of Theorem 6 can be stated, up to small constants, in terms of these measures as well.…”
Section: The Two Policies Case (citation type: mentioning)
confidence: 88%
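The "same order" claim can be made quantitative. Below is a short derivation, assuming the common normalizations h²(P,Q) = ½ Σᵢ (√pᵢ − √qᵢ)² and Δ(P,Q) = Σᵢ (pᵢ − qᵢ)²/(pᵢ + qᵢ); the constants may differ in references that normalize differently.

```latex
% Writing (p_i - q_i)^2 = (\sqrt{p_i} - \sqrt{q_i})^2 (\sqrt{p_i} + \sqrt{q_i})^2
% and using  p_i + q_i \le (\sqrt{p_i} + \sqrt{q_i})^2 \le 2 (p_i + q_i)
% (the upper bound is AM-GM), dividing by p_i + q_i and summing over i gives
2\, h^2(P, Q) \;\le\; \Delta(P, Q) \;\le\; 4\, h^2(P, Q),
% so the squared Hellinger distance and the triangular discrimination
% agree up to a constant factor of 2.
```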
“…In order to study the effect of jet radius on separation power, samples are generated for each topology with jet radii in the range 0.2 ≤ R ≤ 1.5 in steps of 0.1. In the language of information theory, this is closely related to the χ² divergence; both are f-divergences [49–51] with f(u) = (u − 1)²/(u + 1) for the classifier separation and f(u) = (u − 1)² for the χ² divergence [52,53]. We are grateful to Ben Elder, who pointed out to us that this quantity has also been referred to as the triangular discriminator in the information theory literature [54].…”
Section: Baseline Analysis (citation type: mentioning)
confidence: 99%
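A minimal numerical sketch of the two generators named in the excerpt, using the generic f-divergence form given earlier; the function f_divergence, the lambda names, and the example distributions are illustrative, not from the cited papers.

```python
import numpy as np

def f_divergence(p, q, f):
    """Generic discrete f-divergence  D_f(P||Q) = sum_i q_i * f(p_i / q_i).

    p, q: probability vectors on the same support. For simplicity this
    sketch assumes q_i > 0 everywhere; a real implementation must treat
    q_i = 0 (and p_i = 0) according to the usual conventions.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    u = p / q
    return float(np.sum(q * f(u)))

# Generators named in the quoted passage:
# classifier separation / triangular discrimination, and chi-squared.
triangular = lambda u: (u - 1.0) ** 2 / (u + 1.0)
chi_squared = lambda u: (u - 1.0) ** 2

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, triangular))   # triangular discrimination
print(f_divergence(p, q, chi_squared))  # chi-squared divergence
```

For any inputs the triangular value is bounded by the chi-squared value, since (u − 1)²/(u + 1) ≤ (u − 1)² for all u ≥ 0; the two generators differ only in the denominator that tempers large likelihood ratios.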