2006
DOI: 10.1016/j.physa.2006.01.027

Unique additive information measures—Boltzmann–Gibbs–Shannon, Fisher and beyond

Abstract: It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Power law equilibrium distributions are found as a result of the interaction of the two terms. The case of second order derivative dependence is investigated and a corresponding additive information measure is given.
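For orientation, the two measures named in the abstract can be written, for a probability density f on R^n, roughly as follows. The notation (f, lambda_1, lambda_2) is ours, not the paper's, and the sign convention is left open; the paper derives the linear combination from the additivity and isotropy requirements rather than postulating it:

S_{BGS}[f] = -\int f(x)\,\ln f(x)\;\mathrm{d}^n x ,
I_F[f] = \int \frac{|\nabla f(x)|^2}{f(x)}\;\mathrm{d}^n x ,
S[f] = \lambda_1\, S_{BGS}[f] + \lambda_2\, I_F[f] .

Here S_{BGS} is the Boltzmann–Gibbs–Shannon entropy (depending on f alone) and I_F is the Fisher information (depending on the first derivative of f); the claimed result is that any additive, isotropic measure depending on f and its first derivative must take this combined form.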

Cited by 7 publications (3 citation statements) · References 28 publications
“…A more ambitious attempt is to set such phenomena into a unified framework of non-extensive thermodynamics, based on certain generalizations of familiar basic formulas. In particular, generalizations of the Boltzmann–Gibbs–Shannon entropy formula were sought as founding stones for such a general treatment [1,2,3,4,5,6,7,8].…”
Section: Introduction
Mentioning, confidence: 99%
“…Conversely, the additivity of entropy is, in some cases, achieved by non-product probabilities, trying to grasp the essence of surviving correlations (surmised to occur due to long-range interactions) in systems that are large in the thermodynamic sense [24]. The additivity of entropy can also be achieved by weakly non-local extensions [25,26]. The connection between generalized, among them non-Boltzmannian, probability distributions and the thermodynamic entropy formula was clarified in a recent paper [27].…”
Mentioning, confidence: 99%
“…where the macro state of an N-particle physical system is given as D_n. The distribution of N particles into n classes is given as follows [9,10]:…”
Section: Definition
Mentioning, confidence: 99%
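The quoted passage breaks off before the formula it introduces. The standard expression for distributing N particles into n classes with occupation numbers N_1, …, N_n, which is presumably what refs. [9,10] state, is the multinomial count, whose logarithm gives the Boltzmann entropy (our reconstruction, not a quotation from the citing paper):

W(D_n) = \frac{N!}{N_1!\,N_2!\cdots N_n!}, \qquad \sum_{i=1}^{n} N_i = N ,
S = k_B \ln W \;\approx\; -N k_B \sum_{i=1}^{n} p_i \ln p_i, \qquad p_i = N_i / N ,

where the last step uses Stirling's approximation and recovers the Boltzmann–Gibbs–Shannon form discussed in the abstract.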