“…). But V(M, P) and V(M, Q) are calculated in (29) and (30). Hence we have D_RJS(P, Q; π) ≥ ½ φ″(1) V(P, Q)².…”
Section: Bounds for JSD Using the Decomposition
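The bound quoted above rests on a Pinsker-type step: each KL-divergence term is bounded below by half the squared variation distance. A minimal numerical sanity check of that step (the distributions are illustrative assumptions, and KL is taken in nats):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p, q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def variation(p, q):
    """(Total) variation distance V(p, q) = sum_i |p_i - q_i|."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.6, 0.4]
q = [0.3, 0.7]
# Pinsker's inequality: D_KL(P, Q) >= (1/2) V(P, Q)^2.
assert kl(p, q) >= 0.5 * variation(p, q) ** 2
```

This verifies the mechanism only; the paper's constant φ″(1) depends on the generator of the particular φ-divergence.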
“…The notation and definitions, including the Chan–Darwiche metric, φ-divergences, the JSD, and the reversed JSD, are presented in Section 2. There the JSD and the reversed JSD are introduced via the notion of blending in [29]. The Pinsker inequality and the reverse Pinsker inequality, see [41], are used in Section 3 for the first upper and lower bounds on the JSD in terms of the variation distance.…”
Section: Organization of the Paper
“…and similarly for D_KL(Q, M) with Q replacing P. When we insert the rightmost expressions in (29) and (30) into the right-hand sides of the bounds on D_KL(P, M) and D_KL(Q, M) above, respectively, we obtain by the same computation as above the right-hand inequality in (27).…”
Section: Applications of Pinsker and Reverse Pinsker Inequalities
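Although (29) and (30) are not reproduced in this excerpt, the quantities they evaluate, V(M, P) and V(M, Q), have a simple closed form when M is the equal-weight mixture M = (P + Q)/2: both equal V(P, Q)/2. A small sketch, with illustrative distributions, confirms this:

```python
p = [0.2, 0.5, 0.3]
q = [0.4, 0.1, 0.5]
m = [0.5 * pi + 0.5 * qi for pi, qi in zip(p, q)]  # equal-weight mixture M

V = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))

# For M = (P + Q)/2: |m_i - p_i| = |q_i - p_i| / 2, so V(M, P) = V(M, Q) = V(P, Q) / 2.
assert abs(V(m, p) - 0.5 * V(p, q)) < 1e-12
assert abs(V(m, q) - 0.5 * V(p, q)) < 1e-12
```

For a general blending weight π the two halves would instead be (1 − π)V(P, Q) and πV(P, Q); the equal-weight case above is an assumption matching π = 1/2.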
“…Proof. We first apply (37) to the two terms in D_JS(P, Q), and then (29) and (30) to the two KL-divergences, to obtain a lower bound, where we set π = 1/2.…”
Section: Lower Bounds for JSD by Refinements of Pinsker's Inequality
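With π = 1/2, chaining the steps of the quoted proof — Pinsker's inequality on each KL term, then V(M, P) = V(M, Q) = V(P, Q)/2 — yields the lower bound D_JS(P, Q) ≥ V(P, Q)²/8. A numerical sketch (the distributions are illustrative; the paper's refinements of Pinsker's inequality may give sharper constants):

```python
import math

def kl(p, q):
    """KL-divergence in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence with pi = 1/2: mean of the two KLs to the mixture."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.6, 0.4]
q = [0.4, 0.6]
V = sum(abs(pi - qi) for pi, qi in zip(p, q))
# Chaining Pinsker with V(M, P) = V(P, Q)/2 gives D_JS >= V^2 / 8.
assert jsd(p, q) >= V ** 2 / 8 - 1e-12
```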
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence estimate as well as the asymptotic consistency of the minimum Jensen-Shannon divergence estimate. These are key properties for likelihood-free simulator-based inference.
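The claimed equivalence of the maximum likelihood and minimum-JSD estimates can be illustrated in the simplest case, where the model family contains the empirical distribution: both estimates then coincide exactly. The Bernoulli setup below is a hypothetical sketch, not the paper's asymptotic construction:

```python
import math

def kl(p, q):
    """KL-divergence in nats for discrete distributions."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

def jsd(p, q):
    """Jensen-Shannon divergence with pi = 1/2."""
    m = [0.5 * (a + b) for a, b in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Empirical distribution of a Bernoulli sample: 7 successes in 10 trials.
p_hat = [0.7, 0.3]
mle = 0.7  # maximum likelihood estimate = sample mean

# Minimum-JSD estimate by grid search over the Bernoulli parameter theta.
grid = [i / 1000 for i in range(1, 1000)]
theta_jsd = min(grid, key=lambda t: jsd(p_hat, [t, 1 - t]))

assert abs(theta_jsd - mle) < 1e-3  # the two estimates coincide here
```

Since D_JS(P, Q) ≥ 0 with equality iff P = Q, the minimum-JSD fit returns the empirical proportion whenever the model can represent it; the paper's contribution is that the two estimates remain asymptotically equivalent in the general simulator-based setting.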