2005
DOI: 10.1063/1.2121610
Information entropy, information distances, and complexity in atoms

Abstract: Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a recently proposed complexity measure. Shell effects at closed-shell atoms are observed. The complexity measure shows local minima at the closed …
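As a rough illustration of the position-space quantities named in the abstract, the sketch below computes the Shannon entropy S_r = -∫ρ ln ρ d³r and Onicescu's information energy E_r = ∫ρ² d³r for a spherically symmetric, unit-normalized density on a radial grid. It is a minimal sketch only: it substitutes a hydrogen-like 1s density for the Roothaan-Hartree-Fock densities used in the paper, and the function names and grid are illustrative assumptions, not the authors' code.

```python
import numpy as np

def radial_shannon_entropy(r, rho):
    """S_r = -4*pi * integral rho(r) ln(rho(r)) r^2 dr for a spherical density."""
    return -4.0 * np.pi * np.trapz(rho * np.log(rho) * r**2, r)

def onicescu_energy(r, rho):
    """Onicescu information energy E_r = 4*pi * integral rho(r)^2 r^2 dr."""
    return 4.0 * np.pi * np.trapz(rho**2 * r**2, r)

# Illustrative density: hydrogen-like 1s orbital (Z = 1), NOT an RHF atomic density.
r = np.linspace(1e-6, 40.0, 20000)
rho = np.exp(-2.0 * r) / np.pi          # |psi_1s|^2, normalized to one electron

S_r = radial_shannon_entropy(r, rho)    # analytic value: 3 + ln(pi) ~ 4.1447
E_r = onicescu_energy(r, rho)           # analytic value: 1/(8*pi) ~ 0.0398
print(S_r, E_r)
```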

Cited by 165 publications (75 citation statements). References 39 publications.
“…To underline the difference between the quantum entropy and information energy, let us consider a simple example that justifies the study of the latter; namely, it is easy to show [69,70] that for the discrete field with N events the information energy (entropy) reaches its minimum of 1/N (maximum of ln N) when the likelihoods of all occurrences are equal, while the unit maximum (zero minimum) takes place when the probability of one event is certain and all others vanish. Since the former case corresponds to a complete disorder, by analogy with thermodynamics the quantity O is coined 'energy', though actually it is measured in units of the inverse volume of the field upon which it is calculated.…”
Section: Introduction (mentioning, confidence: 99%)
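A minimal numerical check of the extremes quoted above, assuming the standard discrete definitions O = Σ p_i² and S = -Σ p_i ln p_i (this example is not taken from the cited works):

```python
import numpy as np

def info_energy(p):
    """Onicescu information energy O = sum_i p_i^2 of a discrete distribution."""
    return np.sum(p**2)

def shannon(p):
    """Shannon entropy S = -sum_i p_i ln p_i, with 0 ln 0 taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 5
uniform = np.full(N, 1.0 / N)              # complete disorder
certain = np.zeros(N); certain[0] = 1.0    # one event is certain

print(info_energy(uniform), shannon(uniform))  # 1/N = 0.2,  ln N ~ 1.609
print(info_energy(certain), shannon(certain))  # 1.0,        0.0
```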
“…Similar to the quantum entropy and Fisher information, it finds applications not only in physics but, for example, in social sciences [71]. A comparison of the three information measures has been performed for a number of systems [40, 69-72]. For the only bound state of the negative Robin wall the information energies are:…”
Section: Introduction (mentioning, confidence: 99%)
“…Using different phase-spaces to measure Shannon entropy will lead to different expressions. Studies of Shannon entropy in position space and momentum space of atomic systems have been carried out [7-14]. We calculate Shannon entropy in position basis and momentum basis to discuss the correlation.…”
Section: Open Access (mentioning, confidence: 99%)
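For a concrete, hedged illustration of position- and momentum-space Shannon entropies, the sketch below uses the one-dimensional harmonic-oscillator ground state (a Gaussian), not the atomic Roothaan-Hartree-Fock densities of the paper; its entropy sum saturates the one-dimensional Bialynicki-Birula-Mycielski bound S_x + S_p ≥ 1 + ln π.

```python
import numpy as np

# 1D harmonic-oscillator ground state with hbar = m = omega = 1 (illustrative only).
x = np.linspace(-10, 10, 4001)
p = np.linspace(-10, 10, 4001)

rho_x = np.exp(-x**2) / np.sqrt(np.pi)   # position density |psi(x)|^2
rho_p = np.exp(-p**2) / np.sqrt(np.pi)   # momentum density |phi(p)|^2

def entropy(grid, density):
    """S = -integral density * ln(density) over the grid (0 ln 0 taken as 0)."""
    f = np.where(density > 0, density * np.log(density), 0.0)
    return -np.trapz(f, grid)

S_x = entropy(x, rho_x)                  # analytic value: (1 + ln pi)/2 ~ 1.0724
S_p = entropy(p, rho_p)                  # same, since the Gaussian is self-reciprocal
print(S_x, S_p, S_x + S_p, 1 + np.log(np.pi))   # the sum equals 1 + ln(pi) ~ 2.1447
```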
“…In [61] the Jensen-Shannon divergence was first proposed as a measure of distinguishability of two quantum states. Chatzisavvas et al. investigated the quantity for atomic density functions [62].…”
Section: Kullback-Leibler Missing Information (mentioning, confidence: 99%)
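A small sketch of the Jensen-Shannon divergence mentioned in this statement, using the standard discrete definition JSD(p, q) = S((p+q)/2) - [S(p) + S(q)]/2 on toy distributions; the arrays are illustrative stand-ins, not atomic densities.

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a discrete distribution, with 0 ln 0 taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric, zero iff p == q, bounded by ln 2."""
    m = 0.5 * (p + q)
    return shannon(m) - 0.5 * (shannon(p) + shannon(q))

# Two toy normalized distributions standing in for two densities to compare.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(jsd(p, q))
```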