1981
DOI: 10.1007/bf02576360

Informational divergence and the dissimilarity of probability distributions

Cited by 23 publications (17 citation statements)
References 4 publications
“…Mixed f-divergence, which is important in applications such as statistical hypothesis testing and classification (see, e.g., [35,40,61]), measures the difference between multiple pairs of (probability) distributions. Examples include Matusita's affinity [33,34], Toussaint's affinity [51], the information radius [48] and the average divergence [47]. Mixed f-divergence is an extension of the classical f-divergence and can be viewed as a vector form of the classical f-divergence.…”
Section: Introduction (mentioning)
confidence: 99%
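For orientation, here is a minimal sketch of the classical f-divergence mentioned in the excerpt above, for discrete distributions, together with a Matusita/Bhattacharyya-type affinity. The function names and test distributions are illustrative choices, not taken from the cited papers.

```python
import numpy as np

def f_divergence(p, q, f):
    """Classical f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for discrete distributions with strictly positive q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Kullback-Leibler divergence: f(t) = t * log(t)
kl = lambda p, q: f_divergence(p, q, lambda t: t * np.log(t))

# Matusita/Bhattacharyya-type affinity: rho(P, Q) = sum_i sqrt(p_i * q_i);
# it corresponds to the f-divergence with f(t) = -sqrt(t) via rho = -D_f.
affinity = lambda p, q: float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
print(kl(p, q), affinity(p, q))
```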
“…The Jensen-Shannon divergence (JSD), based on the Kullback-Leibler divergence [9][10][11][12], is used to measure the similarity between two probability distributions. Let K(P, Q) be the Kullback-Leibler divergence; then the Jensen-Shannon divergence, written in terms of the Kullback-Leibler divergence, is…”
Section: Departure From the Qua Model (mentioning)
confidence: 99%
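The quoted statement is truncated before the formula. A standard form of the Jensen-Shannon divergence in terms of the Kullback-Leibler divergence K (a well-known identity, not recovered from the truncated excerpt) is:

```latex
\mathrm{JSD}(P \,\|\, Q)
  \;=\; \tfrac{1}{2}\, K\!\left(P \,\big\|\, M\right)
  \;+\; \tfrac{1}{2}\, K\!\left(Q \,\big\|\, M\right),
\qquad M = \tfrac{1}{2}\,(P + Q).
```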
“…In this section we shall relate the R-divergence parametric measures ${}^{t}V_{r}^{s}(\theta)$ ($t = 1$ and $2$) given by (4) to Fisher's information measure $I_{F}^{X}(\theta)$ given by (5).…”
Section: Fisher Measure Of Information And Generalized R-divergence Measures (mentioning)
confidence: 99%
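Equations (4) and (5) referred to in this excerpt are not reproduced here. As general background only, and as an assumed illustration of the kind of relation intended rather than the paper's own equations, smooth f-divergences admit a local expansion in the parameter whose leading coefficient is the Fisher information:

```latex
D_f\!\left(p_{\theta} \,\|\, p_{\theta+\delta}\right)
  \;=\; \frac{f''(1)}{2}\, I_{F}^{X}(\theta)\, \delta^{2} \;+\; o(\delta^{2}),
\qquad
I_{F}^{X}(\theta)
  \;=\; \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}\log p_{\theta}(X)\right)^{\!2}\right].
```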
“…Furthermore, the measure arising from the concavity of Shannon's entropy, known as the information radius [3], is also gaining importance in applications. There exists a beautiful inequality [4,5] between the information radius and the J-divergence. Burbea and Rao [4,6] called the information radius the Jensen difference divergence measure and also presented a one-parameter scalar generalization of it.…”
Section: Introduction (mentioning)
confidence: 99%
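The excerpt mentions an inequality between the information radius and the J-divergence without stating it. One well-known form, for the equal-weight information radius of two distributions (the Jensen-Shannon divergence), is JS(P, Q) ≤ J(P, Q)/4. The sketch below checks this numerically on random discrete distributions; it is illustrative and not a reconstruction of the inequality in [4,5].

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    """Kullback-Leibler divergence for strictly positive discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def j_divergence(p, q):
    """J-divergence: symmetrised KL, J(P,Q) = KL(P||Q) + KL(Q||P)."""
    return kl(p, q) + kl(q, p)

def information_radius(p, q):
    """Equal-weight information radius of two distributions
    (the Jensen-Shannon divergence): 0.5*KL(P||M) + 0.5*KL(Q||M), M = (P+Q)/2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Numerical check of the inequality: information radius <= J-divergence / 4
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    assert information_radius(p, q) <= j_divergence(p, q) / 4 + 1e-12
print("information radius <= J/4 held on all sampled pairs")
```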