2017
DOI: 10.3390/e19080405
Intrinsic Losses Based on Information Geometry and Their Applications

Abstract: One main interest of information geometry is to study the properties of statistical models that do not depend on the coordinate systems or model parametrization; thus, it may serve as an analytic tool for intrinsic inference in statistics. In this paper, under the framework of Riemannian geometry and dual geometry, we revisit two commonly-used intrinsic losses which are respectively given by the squared Rao distance and the symmetrized Kullback-Leibler divergence (or Jeffreys divergence). For an expon…

Cited by 6 publications (4 citation statements)
References 41 publications
“…This type of loss function can be obtained directly from a measure of the discrepancy between two probability density functions. In the literature, the term 'intrinsic loss' is used interchangeably for this kind of loss function (Rong, Tang, and Zhou 2017; Ni and Sun 2021).…”
Section: Kullback-Leibler Distance Loss Function (mentioning confidence: 99%)
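The symmetrized Kullback-Leibler (Jeffreys) divergence mentioned in this excerpt has a closed form for many families. As an illustrative sketch, not taken from the paper itself, the univariate Gaussian case is an assumed example:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    """KL divergence KL(p || q) between N(mu1, s1^2) and N(mu2, s2^2)."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def jeffreys_gauss(mu1, s1, mu2, s2):
    """Symmetrized KL (Jeffreys) divergence, one common intrinsic loss."""
    return kl_gauss(mu1, s1, mu2, s2) + kl_gauss(mu2, s2, mu1, s1)

# Symmetric in its arguments, and zero exactly when the two densities coincide.
print(jeffreys_gauss(0.0, 1.0, 1.0, 2.0))
print(jeffreys_gauss(1.0, 2.0, 0.0, 1.0))  # same value by symmetry
```

Unlike a single KL term, the Jeffreys divergence treats both densities symmetrically, which is what makes it usable as a loss without choosing an ordering.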
“…If g(x) is a reversible function, the PDFs of x and y satisfy p_y(y; θ) dg(x)/dx = p_x(x; θ) (Eq. 7). According to the Fisher-Neyman factorization theorem [27], y is a sufficient statistic of x, so the matrix G_S(θ) computed from y equals the one computed from x.…”
Section: Corollary 2 (If g(x) Is a Reversible Function, G_S(θ) Is Invariant) (mentioning confidence: 99%)
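The invariance claimed in this corollary can be checked numerically for a simple reversible transformation. The exponential model and the choice g(x) = log(x) below are illustrative assumptions, not the cited paper's example:

```python
import math
import random

theta = 2.0
random.seed(0)

def score_x(x, th):
    # x ~ Exp(th): d/dtheta log(th * exp(-th * x)) = 1/th - x
    return 1.0 / th - x

def score_y(y, th):
    # y = log(x) is reversible; p_y(y; th) = th * exp(y) * exp(-th * exp(y)),
    # so d/dtheta log p_y = 1/th - exp(y)
    return 1.0 / th - math.exp(y)

xs = [random.expovariate(theta) for _ in range(10_000)]
# The scores agree pointwise, so the Fisher information (mean squared
# score) is identical for x and for the transformed variable y.
I_x = sum(score_x(x, theta)**2 for x in xs) / len(xs)
I_y = sum(score_y(math.log(x), theta)**2 for x in xs) / len(xs)
print(I_x, I_y)  # both close to the analytic value 1/theta^2 = 0.25
```

Because exp(y) recovers x exactly, no information about theta is lost by the transformation, which is the content of the corollary in this special case.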
“…Given the Fisher information matrix as the Riemannian metric, the distance between any two points (probability distributions) can be calculated [6]. In such a manifold, the distance between two points serves as an intrinsic measure of the dissimilarity between two probability distributions [7]. As information geometry provides a new perspective on signal processing, it has found many applications there.…”
Section: Introduction (mentioning confidence: 99%)
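A minimal sketch of the Fisher information matrix acting as the metric tensor, using a univariate Gaussian as an assumed example: the Monte Carlo estimate of E[score score^T] should approach the analytic metric diag(1/sigma^2, 2/sigma^2).

```python
import random

random.seed(1)
mu, sigma = 0.5, 1.5
n = 100_000

def score(x):
    # Score (gradient of the log-density) of N(mu, sigma^2) w.r.t. (mu, sigma).
    z = (x - mu) / sigma
    return (z / sigma, (z * z - 1.0) / sigma)

scores = [score(random.gauss(mu, sigma)) for _ in range(n)]
# Fisher information matrix G = E[score score^T]; for a Gaussian the
# analytic value is diag(1/sigma^2, 2/sigma^2) with zero off-diagonal.
g11 = sum(s[0] * s[0] for s in scores) / n
g12 = sum(s[0] * s[1] for s in scores) / n
g22 = sum(s[1] * s[1] for s in scores) / n
print(g11, g12, g22)  # close to 1/sigma^2, 0, 2/sigma^2
```

The matrix G defines the inner product on the tangent space at (mu, sigma), from which geodesic distances between distributions are computed.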
“…The Wasserstein distance, a geodesic distance on the Riemannian manifold equipped with the Wasserstein metric, is usually used for optimal transportation [21,22], while the Fisher metric applies primarily to information science. Therefore, by endowing such a space with a natural Riemannian structure (i.e., the Fisher metric and Levi-Civita connection), the resulting geodesic distance, also called the Rao distance, has been taken as an intrinsic measure of the dissimilarity between two PDFs and then applied in wide-ranging fields such as neural networks [23,24], signal processing [25,26], and statistical inference [27,28].…”
Section: Introduction (mentioning confidence: 99%)
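For a concrete instance of a Rao distance, the Bernoulli family admits a well-known closed form via the square-root reparametrization; this example is illustrative and not drawn from the cited works:

```python
import math

def rao_bernoulli(p, q):
    """Rao (geodesic) distance between Bernoulli(p) and Bernoulli(q) under
    the Fisher metric; the map p -> 2*asin(sqrt(p)) flattens the metric."""
    return 2.0 * abs(math.asin(math.sqrt(p)) - math.asin(math.sqrt(q)))

print(rao_bernoulli(0.5, 0.5))  # 0.0: identical distributions
print(rao_bernoulli(0.2, 0.8))  # symmetric: equals rao_bernoulli(0.8, 0.2)
print(rao_bernoulli(0.0, 1.0))  # pi: the maximal distance on this family
```

Unlike the KL divergence, this quantity is a true metric (symmetric, satisfies the triangle inequality), which is what makes it attractive as an intrinsic dissimilarity measure.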