2020
DOI: 10.3390/e22080879
Gintropy: Gini Index Based Generalization of Entropy

Abstract: Entropy is used in physics, mathematics, informatics and related areas to describe equilibration, dissipation, maximal-probability states and optimal compression of information. The Gini index, on the other hand, is an established measure of social and economic inequality in a society. In this paper, we explore the mathematical similarities and connections between these two quantities and introduce a new measure that connects the two at an interesting level of analogy. This supports the…
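The Gini index named in the abstract is the standard Lorenz-curve inequality measure: g = 1 − 2 × (area under the Lorenz curve). A minimal numerical sketch (illustrative only, not code from the paper; the function name `gini` and the trapezoidal estimate are our own choices):

```python
import numpy as np

def gini(x):
    """Gini index of a non-negative sample x, via the empirical Lorenz curve."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # Lorenz curve L(p): cumulative wealth share of the poorest fraction p
    lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))
    p = np.linspace(0.0, 1.0, n + 1)
    # g = 1 - 2 * (area under the Lorenz curve), trapezoidal rule
    return 1.0 - 2.0 * np.trapz(lorenz, p)

print(gini([1, 1, 1, 1]))  # perfect equality -> 0.0
print(gini([0, 0, 0, 1]))  # one person owns everything -> 0.75 (= 1 - 1/n for n = 4)
```

For perfect equality the Lorenz curve is the diagonal, so the enclosed area vanishes and g = 0; the finite-sample extreme is 1 − 1/n rather than 1.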

Cited by 28 publications (20 citation statements)
References 27 publications
“…We then conclude that the Gini index (g) corresponds to free energy (F) or entropy (S) (cf. [16,17]) and the Kolkata index (k) corresponds to the inverse temperature (1/T) of an equivalent thermodynamic system.…”
Section: Summary and Discussion (mentioning; confidence: 99%)
“…This indicates that Gini index g ∼ F, where F denotes entropy (see e.g. [16,17]) in the CC model and that the Kolkata index k, given by L…”
Section: Thermodynamic Mapping of g and k Indices (mentioning; confidence: 99%)
“…2 and 3) that as inequality increases (with increasing n) from equality k = 0.5 and g = 0 for n = 1 to extreme inequality k = g = 1 as n → ∞, k has a non-monotonic variation with respect to g such that k and g cross at k = g ≃ 0.86 and they finally meet at k = g = 1. As the Gini index (g) is identified (see [21]) as the information entropy…
TABLE I. Statistical analysis of the papers and their citations for 100 'randomly chosen' scientists (including 20 Nobel Laureates, denoted by * before their names) in physics (Phys), chemistry (Chem), biology/physiology/medicine (Bio), mathematics (Maths), economics (Econ) and sociology (Soc), having individual Google Scholar pages (with 'verifiable email site') and at least 100 entries (papers or documents, latest not before 2018), with Hirsch index (h) [17] value 20 or above.…”
Section: Summary and Discussion (mentioning; confidence: 99%)
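The Kolkata index k quoted above can be read off the same Lorenz curve L(p): it is the fixed point L(k) = 1 − k, i.e. the richest (1 − k) fraction of the population holds a k fraction of the wealth, with k = 0.5 at perfect equality and k → 1 at extreme inequality. A hedged sketch under the assumption of a piecewise-linear empirical Lorenz curve (the function name `kolkata_index` and the interpolation scheme are our own, not from the cited papers):

```python
import numpy as np

def kolkata_index(x):
    """Kolkata index: solve L(k) = 1 - k on the empirical Lorenz curve of x.

    L(p) is the cumulative wealth share of the poorest fraction p, so k is
    the wealth share held by the richest (1 - k) fraction.  Equality gives
    k = 0.5; extreme inequality pushes k toward 1.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    p = np.linspace(0.0, 1.0, n + 1)
    lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))
    # f(p) = L(p) + p - 1 is strictly increasing from -1 to +1,
    # so its unique root is the Kolkata index
    f = lorenz + p - 1.0
    i = np.searchsorted(f, 0.0)  # first grid index with f >= 0
    if f[i] == 0.0:
        return p[i]
    # linear interpolation on the bracketing segment [i-1, i]
    t = -f[i - 1] / (f[i] - f[i - 1])
    return p[i - 1] + t * (p[i] - p[i - 1])

print(kolkata_index([1, 1, 1, 1]))  # perfect equality -> 0.5
```

On such finite samples k lies between 0.5 and 1, consistent with the range discussed in the quoted passage.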
“…Equation (27) could also be derived in a different manner by using a generalized definition of entropy, the so-called Havrda-Charvát-Tsallis entropy [121,122]. This has been used in several econophysics studies (e.g., [123,124]). However, we contend that the derivation using the classical (Boltzmann-Gibbs-Shannon) definition (and an appropriate, non-Lebesgue background measure) is more natural and advantageous as it satisfies Shannon's postulates and retains the properties resulting from these postulates.…”
Section: Hyperbolic Background Measure and the Pareto Distribution (mentioning; confidence: 99%)
“…Interestingly, Chakrabarti et al [53] (p. 45) characterized a society with an exponential wealth distribution as "super-fair", and one with a Pareto distribution as "fair" or "unfair", depending on the tail index and other parameters. Biró and Néda [124], referring to income or wealth distributions, characterized the exponential distribution as "natural" and the Pareto distribution as "capitalism", and they provide some additional cases ("communism", "communism++", "eco-window"). Here we adopt the name "natural" for the exponential distribution, because it corresponds to the Lebesgue background measure, where the distance between two values equals their difference (see the discussion about distance in [80,118]).…”
Section: Hyperbolic Background Measure and the Pareto Distribution (mentioning; confidence: 99%)