2006
DOI: 10.1016/j.physa.2005.06.072

Nonextensive triangle equality and other properties of Tsallis relative-entropy minimization

Abstract: Kullback-Leibler relative-entropy has unique properties in cases involving distributions resulting from relative-entropy minimization. Tsallis relative-entropy is a one parameter generalization of Kullback-Leibler relative-entropy in the nonextensive thermostatistics. In this paper, we present the properties of Tsallis relative-entropy minimization and present some differences with the classical case. In the representation of such a minimum relative-entropy distribution, we highlight the use of the q-product, …
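For orientation, the standard definitions behind the abstract (taken from the nonextensive-statistics literature, not quoted from the paper itself): the Tsallis relative entropy is the one-parameter deformation of the Kullback-Leibler form, and the q-product is the deformed product under which the minimizing distribution keeps a product-like representation.

```latex
% Tsallis relative entropy; it recovers the Kullback-Leibler divergence as q -> 1.
I_q(p\|r) \;=\; \frac{1}{q-1}\left(\sum_x p(x)^q\, r(x)^{1-q} - 1\right)
          \;=\; -\sum_x p(x)\,\ln_q\!\frac{r(x)}{p(x)},
\qquad \ln_q x = \frac{x^{1-q}-1}{1-q}.

% q-product (Borges / Nivanen et al.), which restores a product-like rule
% for q-exponentials: e_q^{a} \otimes_q e_q^{b} = e_q^{a+b}.
x \otimes_q y \;=\; \bigl[\,x^{1-q} + y^{1-q} - 1\,\bigr]_{+}^{\frac{1}{1-q}}.
```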

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

0
18
1

Year Published

2009
2009
2020
2020

Publication Types

Select...
4
2
1

Relationship

0
7

Authors

Journals

citations
Cited by 24 publications
(19 citation statements)
references
References 28 publications
0
18
1
Order By: Relevance
“…Secondly, the KL-divergence plays a central role in the Principle of minimum of relative Boltzmann-Gibbs entropy, MinxEnt. According to known results (see, e.g., Borland et al., 1998, Dukkipati et al., 2006, Tsallis, 2008, etc.), the distribution m(x) that provides the minimum for I_KL[m : r], given the mean value of m equal to s, is the Boltzmann distribution…”
Section: Dynamical Principles Of Minimum Of Shannon Information Loss
confidence: 90%
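For reference, the classical result this excerpt invokes (a standard MinxEnt fact, stated here in generic notation rather than quoted from the citing paper): minimizing I_KL[m : r] under a prescribed mean yields an exponentially tilted form of the prior r, i.e., the Boltzmann distribution.

```latex
% Minimize I_KL[m:r] = \sum_x m(x)\ln\frac{m(x)}{r(x)}
% subject to \sum_x m(x) = 1 and \sum_x m(x)\,x = s:
m(x) \;=\; \frac{r(x)\, e^{-\beta x}}{Z(\beta)},
\qquad Z(\beta) \;=\; \sum_x r(x)\, e^{-\beta x},
```

where the Lagrange multiplier β is fixed by the mean constraint Σ_x m(x) x = s.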
“…For this we will use an analogue of Pythagoras' theorem for relative entropies as first derived by Csiszar [9]. It has also been called the triangle equality by others [10]:…”
Section: Combining the Different Concepts In One Equation
confidence: 99%
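The classical (q = 1) triangle equality cited in this excerpt can be checked numerically. The sketch below is illustrative only: the prior r, the constraint function f, and the target mean s are made-up toy values, and the tilted/brentq construction simply solves the Kullback-Leibler MinxEnt problem so that D(p‖r) = D(p‖m) + D(m‖r) can be verified for another member p of the same linear family.

```python
import numpy as np
from scipy.linalg import null_space
from scipy.optimize import brentq

rng = np.random.default_rng(0)
r = rng.dirichlet(np.ones(5))     # toy prior r(x) over 5 states (assumption)
f = np.arange(5.0)                # constraint function f(x)
s = 1.5                           # prescribed mean <f>_p = s

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive p, q."""
    return float(np.sum(p * np.log(p / q)))

def tilted(beta):
    """Exponentially tilted prior: the KL-MinxEnt (Boltzmann) form."""
    w = r * np.exp(-beta * f)
    return w / w.sum()

# Fix the Lagrange multiplier beta so that the mean constraint holds.
beta = brentq(lambda b: tilted(b) @ f - s, -50.0, 50.0)
m = tilted(beta)                  # minimizer of D(. || r) over the linear family

# Build another member p of the same family {p : sum p = 1, <f>_p = s}
# by perturbing m along the null space of both linear constraints.
V = null_space(np.vstack([np.ones_like(f), f]))
step = 0.5 * m.min() / np.abs(V[:, 0]).max()    # keeps p strictly positive
p = m + step * V[:, 0]

# Triangle equality: D(p || r) = D(p || m) + D(m || r) for every p in the family.
print(kl(p, r), kl(p, m) + kl(m, r))
```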
“…Specifying q = 2 − q* in the numerator of (38) and evoking (4) yields the canonical transition probability…”
Section: Case For
confidence: 99%
“…• A-priori specifying the nonextensivity parameter q and the effective nonextensive trade-off parameter for a single source alphabet β*(x) obtained from (38) for all source alphabets, the expected distortion D = ⟨d(x, x̂)⟩_{p(x, x̂)} is obtained. Choosing a random data point in X (the convex set of probability distributions B, described in Sections 4.1 and 4.2), an initial guess for p(x̂) is made.…”
Section: Nonextensive Alternating Minimization Algorithm Revisited
confidence: 99%
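For orientation, the nonextensive alternating minimization referred to in this excerpt generalizes the classical (q → 1) Blahut-Arimoto scheme for rate-distortion. The sketch below shows only that classical alternating structure, not the nonextensive variant of the citing paper; the function name blahut_arimoto, the trade-off parameter beta, and the binary-source toy inputs are illustrative assumptions.

```python
import numpy as np

def blahut_arimoto(px, d, beta, iters=200):
    """Classical (q -> 1) alternating minimization for rate-distortion.

    px   : source distribution p(x), shape (nx,)
    d    : distortion matrix d(x, xhat), shape (nx, nxhat)
    beta : trade-off parameter between rate and expected distortion
    """
    nx, nxhat = d.shape
    q = np.full(nxhat, 1.0 / nxhat)            # initial guess for p(xhat)
    for _ in range(iters):
        # Step 1: minimize over the conditional p(xhat | x) with q fixed.
        w = q * np.exp(-beta * d)               # shape (nx, nxhat)
        p_cond = w / w.sum(axis=1, keepdims=True)
        # Step 2: minimize over the reproduction marginal with p(xhat | x) fixed.
        q = px @ p_cond
    # Expected distortion D = <d(x, xhat)> under p(x) p(xhat | x).
    distortion = float(np.einsum('x,xy,xy->', px, p_cond, d))
    return p_cond, q, distortion

# Toy usage: binary source, Hamming distortion.
px = np.array([0.6, 0.4])
d = 1.0 - np.eye(2)
p_cond, q, D = blahut_arimoto(px, d, beta=2.0)
print(D)
```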