2018
DOI: 10.1088/1742-5468/aab1b6
Critical scaling of the mutual information in two-dimensional disordered Ising models

Abstract: Rényi mutual information (RMI), computed from second Rényi entropies, can identify classical phase transitions from its finite-size scaling at the critical points. We apply this technique to examine the presence or absence of finite-temperature phase transitions in various two-dimensional models on a square lattice, which extend the conventional Ising model by adding quenched disorder. When the quenched disorder causes the nearest-neighbor bonds to be both ferromagnetic and antiferromagnetic, (a…
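The quantity in the abstract can be written out explicitly: the second Rényi entropy of a distribution p is S_2 = -ln Σ_i p_i², and the RMI of a bipartition (A, B) is I_2(A:B) = S_2(A) + S_2(B) - S_2(AB). The sketch below is illustrative only (function names are my own, and it assumes the joint distribution is known explicitly; in Monte Carlo studies such as this paper, S_2 is instead estimated by replica/ratio techniques):

```python
import numpy as np

def renyi2_entropy(p):
    """Second Rényi entropy S_2 = -ln(sum_i p_i^2) of a distribution."""
    p = np.asarray(p, dtype=float)
    return -np.log(np.sum(p ** 2))

def renyi2_mutual_information(p_joint):
    """RMI I_2(A:B) = S_2(A) + S_2(B) - S_2(AB), with the joint
    distribution given as a 2D array p_joint[a, b]."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_a = p_joint.sum(axis=1)  # marginal over B
    p_b = p_joint.sum(axis=0)  # marginal over A
    return (renyi2_entropy(p_a) + renyi2_entropy(p_b)
            - renyi2_entropy(p_joint.ravel()))

# Perfectly correlated pair of bits: I_2 = ln 2
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(renyi2_mutual_information(p))  # → ln 2 ≈ 0.693
```

For independent variables the RMI vanishes, and for the perfectly correlated pair above it saturates at ln 2, which is the qualitative behavior exploited in the finite-size scaling analysis.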


Cited by 10 publications (11 citation statements)
References: 34 publications
“…Mutual information is an entropy-based measure of the "shared information" of two random variables, quantifying how knowledge of one reduces uncertainty about the other and vice versa 54 . Mutual information peaks at the critical temperature of spin systems at second-order transitions and has been widely used to detect phase transitions 55,56 ; an advantage of this statistic is that it captures nonlinear dependence, unlike Moran's I and covariance, which account only for linear dependence.…”
Section: Methods
confidence: 99%
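The advantage over linear measures claimed in the statement above can be seen in a toy example: for X uniform on {-1, 0, 1} and Y = X², the covariance vanishes while the mutual information does not. A minimal sketch (the distribution and function names are my own illustration, not from the cited works):

```python
import numpy as np

def shannon_mi(p_joint):
    """Shannon mutual information I(X;Y) = sum p(x,y) ln[p(x,y)/(p(x)p(y))]."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_x = p_joint.sum(axis=1, keepdims=True)  # marginal of X
    p_y = p_joint.sum(axis=0, keepdims=True)  # marginal of Y
    mask = p_joint > 0                        # skip zero-probability cells
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_x * p_y)[mask])))

# X uniform on {-1, 0, 1}, Y = X^2. Rows index X, columns index Y in (0, 1).
p = np.array([[0.0, 1/3],
              [1/3, 0.0],
              [0.0, 1/3]])
xs = np.array([-1.0, 0.0, 1.0])
ys = np.array([0.0, 1.0])

# Covariance E[XY] - E[X]E[Y] is exactly zero by symmetry.
e_xy = float((p * np.outer(xs, ys)).sum())
e_x = float(p.sum(axis=1) @ xs)
e_y = float(p.sum(axis=0) @ ys)
cov = e_xy - e_x * e_y

print(f"cov = {cov:.3f}, MI = {shannon_mi(p):.3f}")  # cov ≈ 0, MI ≈ 0.637
```

Here Y is a deterministic (but nonlinear) function of X, so MI is large even though every linear statistic is blind to the dependence.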
“…Much effort has focused on the behaviour of the mutual information. For instance, it has been suggested [25][26][27][28][29][30] that the mutual information exhibits a crossing for different system sizes at a finite-temperature critical point, similar to more traditional tools in critical phenomena such as the Binder cumulant [31].…”
Section: Introduction
confidence: 95%
“…The sensitivity of MI to nonlinear dependence is certainly appealing in studies on GT [19,20,55] as well as in other fields of physics, including the topological transition in the XY model [28], the phase transition in a 2D disordered Ising model [53], and the evaluation of the configurational entropy of liquid metals [25]. On the other hand, unlike the Pearson correlation coefficient, evaluating MI requires knowledge of the distributions of the random variables.…”
Section: Draft
confidence: 99%