2014
DOI: 10.1039/c4cp01729c
Information and complexity measures in molecular reactivity studies

Abstract: Information and complexity measures have been analyzed as tools for investigating chemical reactivity in the spin-position and position spaces, for both the density and shape representations. The concepts of transferability and additivity of atoms and functional groups were used as "checkpoints" in the analysis of the results. The shape function as an argument of the various measures reveals less information than the spinor density. Use of the shape function can yield wrong conclusi…
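The abstract contrasts the density and shape representations as arguments of information measures. A minimal sketch (an illustrative 1D Gaussian model density, not data from the paper) shows why the two representations are closely related: the Shannon entropies of a density ρ normalized to N electrons and of its shape function σ = ρ/N differ only by an N-dependent constant, S[ρ] = N·S[σ] − N ln N.

```python
import numpy as np

# Illustrative 1D model density (NOT from the paper): a Gaussian
# normalized to N electrons, compared against its shape function
# sigma(r) = rho(r) / N, which integrates to 1.
N = 10.0                            # total electron number (assumed, illustrative)
r = np.linspace(-10, 10, 2001)
rho = np.exp(-r**2)
rho *= N / np.trapz(rho, r)         # normalize so that integral(rho) = N

sigma = rho / N                     # shape function

def shannon(f, r):
    """Shannon entropy S = -integral f ln f dr on a grid."""
    integrand = np.where(f > 0, f * np.log(f), 0.0)
    return -np.trapz(integrand, r)

S_rho = shannon(rho, r)
S_sigma = shannon(sigma, r)

# Exact relation between the two representations:
#   S[rho] = N * S[sigma] - N ln N
print(S_rho, N * S_sigma - N * np.log(N))
```

The constant offset depends only on the electron number, which is one way to see that the shape function carries less information than the full (spinor) density whenever N varies across the systems being compared.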

Cited by 7 publications (10 citation statements)
References 77 publications
“…Other applications concern, for example, the use of the Shannon entropy as an indicator of avoided crossings in atomic spectroscopy for electronic systems in the presence of magnetic and electric fields [18] or the study of relevant chemical reactions such as bimolecular nucleophilic substitution reactions. [19][20][21][22] Moving forward, of particular interest for the discussion in this article is the recurrence of the idea of the logarithm (and, directly or indirectly, of the Shannon entropy) to describe electronic correlations. The use of the logarithm of a distribution, and thus something strictly related to the Shannon entropy, as a statistical measure of the correlation strength was put forward, for example, by Gottlieb and Mauser.…”
Section: S52 (mentioning)
confidence: 99%
“…Originating from Claude Shannon's 1948 paper entitled “The mathematical theory of communication,” and still enjoying widespread application in mathematics, statistics, computer science, physics, neurobiology and many other disciplines, information theory studies the quantification and communication of information, with which a probability function is often associated. Because the electron density can in principle be regarded as such a continuous local distribution function, ever since the early days of density functional theory (DFT) there has been tremendous interest in the literature in applying information theory to the electronic structure theory of atoms and molecules. This effort is usually categorized as the information-theoretic approach (ITA) in DFT.…”
Section: Introduction (mentioning)
confidence: 99%
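The excerpt above treats the electron density as a continuous probability distribution to which information-theoretic quantities can be applied. A hedged sketch (unit-normalized 1D Gaussian model, illustrative only, not from any cited paper) computes two quantities standard in ITA studies, the Shannon entropy S = -∫ p ln p dx and the Fisher information I = ∫ (p′)²/p dx, on a grid:

```python
import numpy as np

# Illustrative unit-normalized model distribution: standard Gaussian.
# For a Gaussian of variance v, the analytic values are
#   S = 0.5 * ln(2*pi*e*v)   and   I = 1/v.
x = np.linspace(-8, 8, 4001)
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Shannon entropy: global measure of spread / delocalization
S = -np.trapz(p * np.log(p), x)

# Fisher information: local measure of gradient content / localization
dp = np.gradient(p, x)
I = np.trapz(dp**2 / p, x)

print(S, I)   # ≈ 1.4189, ≈ 1.0 for variance v = 1
```

The pairing is deliberate: Shannon entropy is a global functional of the distribution, while Fisher information is sensitive to local structure, which is why ITA studies typically report both.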
“…Furthermore, Tsallis statistics is relevant to certain systems of ultracold atoms [124][125][126][127]. Moreover, these IT quantities are applied to investigate electron correlation [30,31,33,39,72,[128][129][130][131][132][133][134][135][136] and various molecular properties as well [28,36,45,48,49,[59][60][61]72,73]. Calculated with the highly correlated Hylleraas wave functions, the results in this work then serve as a useful and reliable reference for the various applications mentioned above.…”
Section: Discussion (mentioning)
confidence: 95%