2010
DOI: 10.3390/e12122497
Tsallis Entropy, Escort Probability and the Incomplete Information Theory

Abstract: Non-extensive statistical mechanics is a powerful framework for describing complex systems. Tsallis entropy, the core of this theory, has remained an unproven assumption, and many attempts have been made to derive it axiomatically. Here we follow the work of Wang (EPJB, 2002) and use incomplete information theory to recover the Tsallis entropy. We modify the incomplete-information axioms to incorporate the escort probability, and obtain a corrected form of the Tsallis entropy in comparison with W…
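The two central quantities of the abstract can be illustrated numerically. The sketch below (an illustration under standard definitions, not the paper's own derivation) computes the Tsallis entropy S_q = (1 − Σᵢ pᵢ^q)/(q − 1) and the escort distribution Pᵢ = pᵢ^q / Σⱼ pⱼ^q for a discrete distribution; the function names are ours.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    Recovers the Shannon entropy in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability states
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def escort(p, q):
    """Escort distribution P_i = p_i^q / sum_j p_j^q."""
    p = np.asarray(p, dtype=float)
    w = p ** q
    return w / w.sum()

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, q=2.0))  # (1 - 0.38) / 1 = 0.62
print(escort(p, q=2.0))           # [0.25, 0.09, 0.04] / 0.38
```

For q > 1 the escort weighting emphasizes the high-probability states, which is the sense in which it encodes "incomplete" knowledge of the distribution.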

Cited by 16 publications (7 citation statements)
References 22 publications
“…43 The entropy can be organized by using the axioms of the incomplete information theory as in Equation 10: In conclusion, if the axioms of the information theory are changed in order to comprise the escort probability as an indicator of incomplete knowledge, the form of Tsallis entropy is obtained but in terms of the escort probability. 45 Kraskov Entropy.…”
Section: Entropy-based Feature Extraction Methodsmentioning
confidence: 99%
“…It is most frequently used in crystallography [1], chemistry [2] and physics [3,4], but also in many other very diverse areas: natural language processing [5], transportation [6], character recognition [7], image processing [8,9], economy [10]. Theoretical developments also continue [11][12][13].…”
Section: Maximum Entropy Methodsmentioning
confidence: 99%
“…In non-extensive statistical physics, the quantity to be compared with the distribution of the observed system is not the original but its associated escort distribution [ 18 , 19 , 39 ]. The normalized cumulative distribution of the acoustic parameter , expressed as a q-exponential function, is obtained by integrating the probability density function : …”
Section: Theoretical Backgroundmentioning
confidence: 99%
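The last excerpt expresses a normalized cumulative distribution as a q-exponential. The standard q-exponential is e_q(x) = [1 + (1 − q)x]^{1/(1−q)} where the bracket is positive, and 0 otherwise; it reduces to exp(x) as q → 1 and has a power-law tail for q > 1. A minimal sketch (function name ours):

```python
import numpy as np

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)) where the bracket
    is positive, and 0 otherwise; reduces to exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    with np.errstate(divide="ignore"):
        # np.abs keeps the masked-out branch from producing NaN warnings;
        # np.where zeroes it out wherever the bracket is non-positive.
        return np.where(base > 0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

print(q_exponential(-1.0, q=2.0))  # [1 + (-1)(-1)]^(-1) = 0.5
print(q_exponential(-1.0, q=1.0))  # exp(-1) ~ 0.3679
print(q_exponential(-3.0, q=0.5))  # bracket negative -> 0.0
```

For q > 1 this decays as a power law rather than exponentially, which is why q-exponential fits are used to compare observed distributions against their escort counterparts.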