2019
DOI: 10.3390/e21050485

On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means

Abstract: The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of dis…
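
For concreteness, the following is a minimal Python sketch of the mixture-based definition from the abstract, restricted to discrete distributions; the helper names and toy values are illustrative and not taken from the paper (which is concerned with continuous families such as Gaussians, where this quantity has no closed form).

```python
# Minimal sketch of the definition stated in the abstract, restricted to
# discrete distributions: JS(p, q) is the total Kullback-Leibler divergence
# to the average mixture m = (p + q) / 2. Helper names and values are
# illustrative, not taken from the paper.
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats, with 0 log 0 := 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """JS(p, q) = 1/2 KL(p || m) + 1/2 KL(q || m), with m the average mixture."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.1, 0.8]
print(jensen_shannon(p, q))  # symmetric and bounded above by log(2) ≈ 0.693
print(jensen_shannon(q, p))  # same value
```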

Cited by 149 publications (113 citation statements)
References 56 publications
Citing publications: 2019–2024

Citation statements (ordered by relevance):
“…We prove that by taking the middle ground, one can meaningfully quantify the causal effect with mutual information and conditional mutual information in an unconfounded setting. To this end, we employ the weighted Jensen-Shannon divergence [76, 77], which is sensitive to more than just the first moment of a distribution, as a measure of difference between interventional distributions. We then show that SCE and ACE (Definitions 4 and 5) are equivalent to conditional mutual information and mutual information, respectively, when the difference of means is replaced with the Jensen-Shannon divergence.…”
Section: Proposed Methods For Causal Effect Identification
Citation type: mentioning; confidence: 99%
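
To illustrate the quoted point that the weighted Jensen-Shannon divergence is sensitive to more than the first moment, here is a hedged Python sketch under the standard weighted definition JS_w(p, q) = w KL(p || m) + (1 - w) KL(q || m) with m = w p + (1 - w) q; the weight, helper names, and toy distributions are assumptions made only for this example.

```python
# Weighted Jensen-Shannon divergence (standard weighted/skew form, assumed here):
#   JS_w(p, q) = w * KL(p || m) + (1 - w) * KL(q || m),  m = w*p + (1 - w)*q.
# The toy distributions share the same mean, so a difference-of-means measure
# reports zero while the divergence does not.
import numpy as np

def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def weighted_js(p, q, w=0.5):
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = w * p + (1 - w) * q
    return w * kl(p, m) + (1 - w) * kl(q, m)

support = np.array([-1.0, 0.0, 1.0])
p = np.array([0.0, 1.0, 0.0])   # all probability mass at 0
q = np.array([0.5, 0.0, 0.5])   # same mean (0), very different shape

print(support @ p - support @ q)   # difference of means: 0.0
print(weighted_js(p, q, w=0.5))    # log(2): the divergence detects the shape difference
```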
“…Note that JSD is sometimes equivalently defined as a symmetrised Kullback-Leibler divergence between the two distributions and their average mixture [77]. JSD has recently been applied in many machine learning areas such as GANs [78], bootstrapping [79], time series analysis [80] or computer vision [81].…”
Section: Proposed Methods For Causal Effect Identification
Citation type: mentioning; confidence: 99%
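
As a small sanity check, assuming SciPy is available, the mixture-based definition quoted above can be compared with scipy.spatial.distance.jensenshannon; SciPy returns the Jensen-Shannon distance (the square root of the divergence, in nats by default), so the result is squared before comparing.

```python
# Cross-check of the mixture-based definition against SciPy.
# scipy.stats.entropy(p, q) computes KL(p || q); jensenshannon returns the
# square root of the divergence, so it is squared here.
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.1, 0.8])
m = 0.5 * (p + q)

jsd_mixture = 0.5 * entropy(p, m) + 0.5 * entropy(q, m)
jsd_scipy = jensenshannon(p, q) ** 2

print(np.isclose(jsd_mixture, jsd_scipy))  # True
```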
“…Proof. (a) Since F and G belong to the same mixture family, JS(F, G) can be expressed as a Jensen-Bregman divergence [65]. Therefore, it can be written as:…”
Section: Conflicts Of Interest
Citation type: mentioning; confidence: 99%
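
The quoted proof step rewrites the Jensen-Shannon divergence on a mixture family as a Jensen-Bregman divergence. The sketch below checks this identity in the simplest mixture family, categorical distributions, using the negative Shannon entropy as the convex generator; the notation is chosen for illustration and need not match the paper's.

```python
# Jensen-Bregman (Burbea-Rao) rewriting of the Jensen-Shannon divergence on the
# categorical mixture family, with convex generator F(p) = sum_i p_i log p_i:
#   JB_F(p, q) = (F(p) + F(q)) / 2 - F((p + q) / 2)  ==  JS(p, q).
# Generator choice and names are illustrative assumptions.
import numpy as np

def neg_entropy(p):
    p = np.asarray(p, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask])))

def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_bregman(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * (neg_entropy(p) + neg_entropy(q)) - neg_entropy(0.5 * (p + q))

def jensen_shannon(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])
print(np.isclose(jensen_bregman(p, q), jensen_shannon(p, q)))  # True
```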
“…Given this asymmetry, it is not a suitable candidate for a measure of covariate balance among groups: the divergence between two groups would depend upon which group is taken to be the reference group. Jeffreys divergence (J) is a symmetric version of relative entropy, defined as J(P, Q) = KL(P ∥ Q) + KL(Q ∥ P) [7]. One reason why it is not a suitable candidate for the task of assessing covariate balance among groups is that there may be more than two groups.…”
Section: Relative Entropy
Citation type: mentioning; confidence: 99%
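
The asymmetry argument in this last quote is easy to make concrete: the sketch below, with toy distributions chosen only for illustration, shows that the two orderings of relative entropy disagree while the Jeffreys divergence is the same in either direction.

```python
# Relative entropy (KL) is asymmetric; Jeffreys divergence
#   J(P, Q) = KL(P || Q) + KL(Q || P)
# symmetrizes it. Toy distributions chosen only for illustration.
import numpy as np

def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys(p, q):
    return kl(p, q) + kl(q, p)

p = [0.8, 0.15, 0.05]
q = [0.3, 0.4, 0.3]

print(kl(p, q), kl(q, p))              # different values: KL is asymmetric
print(jeffreys(p, q), jeffreys(q, p))  # equal: Jeffreys divergence is symmetric
```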