2019
DOI: 10.3390/e21020112

Information Geometric Duality of ϕ-Deformed Exponential Families

Abstract: In the world of generalized entropies, which, for example, play a role in physical systems with sub- and super-exponential phase-space growth per degree of freedom, there are two ways of implementing constraints in the maximum entropy principle: linear and escort constraints. Both appear naturally in different contexts. Linear constraints appear, e.g., in physical systems when additional information about the system is available through higher moments. Escort distributions appear naturally in the context of multif…
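
For orientation, a minimal sketch of the central object named in the title, written in LaTeX and following Naudts' convention for deformed logarithms (the notation is assumed for illustration, not quoted from the abstract): a positive, increasing function ϕ defines a deformed logarithm, its inverse defines the deformed exponential, and the ϕ-deformed exponential family is built from the latter.

\[
  \log_\phi(x) = \int_1^x \frac{\mathrm{d}t}{\phi(t)}, \qquad
  \exp_\phi = (\log_\phi)^{-1}, \qquad
  p(x;\theta) = \exp_\phi\!\bigl(\theta \cdot x - \Psi(\theta)\bigr).
\]

For ϕ(t) = t this reduces to the ordinary logarithm and the standard exponential family.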


Cited by 16 publications (15 citation statements) · References: 41 publications
“…Here, we should notice that we are not satisfied with only achieving the smoothing purpose but also aim to develop a sparse optimal policy. To achieve this goal, we define a class of proximity functions based on the κ-logarithm function (Korbel et al., 2019), i.e., d(x) = x φ(x) = −(x/2) log_κ(x), where log_κ(x) = (x^(1−κ) − 1)/(1 − κ) for x > 0 and κ ≠ 1. In this paper, we consider the special case κ = 0 and refer to the operator (3.3) as the proximal Bellman operator.…”
Section: Proximal Bellman Operator (mentioning)
confidence: 99%
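
To make the quoted construction concrete, here is a minimal, self-contained Python sketch of the κ-logarithm proximity function as reconstructed above; the function names are made up for illustration, and the factor x/2 (rather than x²) is an assumption based on the κ = 0 case reducing to the familiar sparsemax-style regularizer x(1 − x)/2.

import numpy as np

def log_kappa(x, kappa):
    # Deformed kappa-logarithm: (x^(1-kappa) - 1) / (1 - kappa) for kappa != 1;
    # it tends to the ordinary log(x) as kappa -> 1.
    if np.isclose(kappa, 1.0):
        return np.log(x)
    return (x ** (1.0 - kappa) - 1.0) / (1.0 - kappa)

def proximity(x, kappa):
    # Proximity function d(x) = x * phi(x) = -(x / 2) * log_kappa(x),
    # as reconstructed from the quoted citation statement (assumption).
    return -0.5 * x * log_kappa(x, kappa)

# For kappa = 0, log_0(x) = x - 1, so d(x) = x * (1 - x) / 2,
# the regularizer behind sparsemax-style (sparse) optimal policies.
x = np.linspace(0.01, 1.0, 5)
print(proximity(x, kappa=0.0))
print(np.allclose(proximity(x, 0.0), x * (1.0 - x) / 2.0))  # True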
“…where the function Ψ(θ) is called the Massieu function and normalizes the distribution. As discussed in [26], there are two natural ways to make a connection with the theory of information through the maximum entropy principle. The first is based on maximization of the entropy functional under linear (thermodynamic) constraints; the second is based on maximization under so-called escort (or geometric) constraints.…”
Section: Universality Classes for Scaling Expansions (mentioning)
confidence: 99%
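
For readers who have not met the terminology, a short LaTeX illustration in standard notation (the Tsallis-type escort is shown as one common example; the cited work treats the more general ϕ-deformed setting):

\[
  p(x;\theta) = \exp\bigl(\theta \cdot x - \Psi(\theta)\bigr), \qquad
  \Psi(\theta) = \log \int \mathrm{e}^{\theta \cdot x}\,\mathrm{d}x ,
\]
\[
  \text{linear constraint: } \sum_i p_i\,\epsilon_i = U, \qquad
  \text{escort constraint: } \sum_i P_i\,\epsilon_i = U, \quad
  P_i = \frac{p_i^{\,q}}{\sum_j p_j^{\,q}} .
\]

The Massieu function Ψ(θ) plays the role of the normalizer (log-partition function), and the two constraint types lead to the two dual formulations of the maximum entropy principle mentioned above.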
“…Alternatively, one can use a divergence of Csiszár type, but its information geometry is trivial because it is conformal to the ordinary Fisher information geometry; see, e.g., Refs. [26,35]. Let us consider a parametric family of distributions p(θ).…”
Section: Universality Classes for Scaling Expansions (mentioning)
confidence: 99%
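
As a hedged reminder of the standard fact alluded to here (textbook definitions, not quotations from Refs. [26,35]): a Csiszár divergence is built from a convex function f with f(1) = 0, and its second-order expansion around equal arguments always reproduces the Fisher information metric up to the constant factor f''(1), which is the sense in which its information geometry reduces to the ordinary Fisher geometry.

\[
  D_f(p\,\|\,q) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad
  D_f\bigl(p(\theta)\,\|\,p(\theta + \mathrm{d}\theta)\bigr)
  \approx \tfrac{1}{2}\, f''(1)\, g_{ij}(\theta)\, \mathrm{d}\theta^i\, \mathrm{d}\theta^j ,
\]

where g_{ij}(θ) is the Fisher information metric of the parametric family p(θ).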
“…Finally, it should be mentioned that, after the publication of the articles of this dissertation referred to in Section (3), reference [41] was published …”
Section: Synopsis (unclassified)