1996
DOI: 10.1007/bf02330576

Mutual information of Ising systems

Abstract: We obtain the mutual information of Ising systems, which shows singular behavior near the critical point. We connect the mutual information with the magnetization and the correlation function. The mutual information is a suitable measure for the critical behavior of Ising systems.

I. INTRODUCTION

Let AB be a joint system consisting of individual systems A and B. If A has states {α} and B has states {β}, AB has joint states {αβ}. The probability distributions of these systems are given by …
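The quantity defined in the introduction can be computed directly from a joint distribution: I(A;B) = Σ_{αβ} p(αβ) log[ p(αβ) / (p(α) p(β)) ]. A minimal sketch in Python (the joint distribution below is an illustrative made-up example, not data from the paper):

```python
import math

def mutual_information(joint):
    """I(A;B) in bits, given a dict {(a, b): p(a, b)} of joint probabilities."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p  # marginal of A
        pb[b] = pb.get(b, 0.0) + p  # marginal of B
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Two correlated binary spins (illustrative numbers only)
joint = {(1, 1): 0.4, (1, -1): 0.1, (-1, 1): 0.1, (-1, -1): 0.4}
print(mutual_information(joint))  # ≈ 0.278 bits
```

For a product distribution the sum vanishes, so I(A;B) measures exactly the statistical dependence between A and B.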

Cited by 47 publications (49 citation statements)
References 7 publications
“…It has been established previously that pairwise mutual information peaks at the phase transition for the 2D lattice Ising model [1,2]. Our results show that so too does pairwise transfer entropy.…”
Section: Critical Inverse Temperature (supporting)
Confidence: 58%
“…This is essentially the quantity previously considered in Refs. [1,2]; Ref. [17] also considers the mutual information between two halves of a cylindrical 2D lattice.…”
(mentioning)
Confidence: 99%
“…It has been analytically shown that, in a two-dimensional Ising model, the mutual information between joint states of two spin systems peaks at the critical temperature [33]. Barnett et al [26] show empirically that mutual information measured between pairs of neighboring spins peaks at the phase transition.…”
Section: Related Work (mentioning)
Confidence: 99%
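The peak of pairwise mutual information at the critical point can be illustrated numerically. The sketch below is not the authors' analytic method: it runs a small Metropolis simulation of the 2D Ising model (lattice size, sweep counts, seed, and the ordered initial state are arbitrary illustrative choices) and estimates the mutual information between nearest-neighbor spin pairs from sampled pair frequencies. β_c = ln(1+√2)/2 is the exact 2D critical coupling.

```python
import math
import random

def ising_sweep(spins, L, beta, rng):
    """One Metropolis sweep over an L x L periodic square lattice."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        s = spins[i][j]
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * s * nb  # energy change for flipping s (coupling J = 1)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -s

def pair_mutual_information(beta, L=12, sweeps=2400, burn=800, seed=7):
    """Estimate I (in bits) between horizontally adjacent spins."""
    rng = random.Random(seed)
    # ordered start avoids frozen multi-domain states at low temperature
    spins = [[1] * L for _ in range(L)]
    counts, total = {}, 0
    for t in range(sweeps):
        ising_sweep(spins, L, beta, rng)
        if t < burn:
            continue
        for i in range(L):
            for j in range(L):
                pair = (spins[i][j], spins[i][(j + 1) % L])
                counts[pair] = counts.get(pair, 0) + 1
                total += 1
    joint = {k: c / total for k, c in counts.items()}
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items())

beta_c = 0.5 * math.log(1.0 + math.sqrt(2.0))  # exact 2D critical coupling
mi_hot = pair_mutual_information(0.05)     # far above T_c
mi_crit = pair_mutual_information(beta_c)  # at the critical point
mi_cold = pair_mutual_information(1.5)     # far below T_c
```

Comparing the three estimates, the value near β_c is the largest: at high temperature neighboring spins are nearly independent, while at low temperature the marginal entropy vanishes, so the mutual information is small on both sides of the transition.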
“…In particular, mutual information is adopted as a general measure of correlation between two systems. Mutual information, as well as entropy, have found significance in various applications in diverse fields, such as in analyzing experimental time series [37][38][39], in characterizing symbol sequences such as DNA sequences [40][41][42] and in providing a theoretical basis for the notion of complexity [43][44][45][46][47], just to name a few.…”
Section: Mathematical Tools (mentioning)
Confidence: 99%