2018
DOI: 10.1103/physrevb.98.060301
Learning phase transitions from dynamics

Abstract: We propose the use of recurrent neural networks for classifying phases of matter based on the dynamics of experimentally accessible observables. We demonstrate this approach by training recurrent networks on the magnetization traces of two distinct models of one-dimensional disordered and interacting spin chains. The obtained phase diagram for a well-studied model of the many-body localization transition shows excellent agreement with previously known results obtained from time-independent entanglement spectra…
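The abstract's pipeline — feed time traces of an observable into a recurrent network and read off a phase label — can be illustrated with a deliberately simplified sketch. The snippet below is not the paper's model: it uses synthetic traces (an exponentially decaying "thermal" signal versus a plateauing "localized" one), a fixed random recurrent layer in the echo-state style rather than a trained RNN, and a logistic-regression readout on the final hidden state. All names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_traces(n, T=50):
    """Synthetic magnetization traces: the 'thermal' class decays to zero,
    the 'localized' class retains memory of the initial polarization."""
    t = np.arange(T)
    X, y = [], []
    for _ in range(n):
        if rng.random() < 0.5:  # thermal phase: magnetization relaxes
            X.append(np.exp(-t / 5.0) + 0.05 * rng.standard_normal(T))
            y.append(0)
        else:                   # localized phase: magnetization plateaus
            X.append(0.6 + 0.05 * rng.standard_normal(T))
            y.append(1)
    return np.array(X), np.array(y)

H = 16  # hidden units
W_in = rng.standard_normal((H, 1)) * 0.5
W_h = rng.standard_normal((H, H)) * (0.9 / np.sqrt(H))  # contractive recurrence

def features(X):
    """Final hidden state of a fixed random recurrent layer (echo-state style)."""
    h = np.zeros((X.shape[0], H))
    for step in range(X.shape[1]):
        h = np.tanh(X[:, step:step + 1] @ W_in.T + h @ W_h.T)
    return h

def train_readout(F, y, epochs=500, lr=0.5):
    """Logistic-regression readout trained by plain gradient descent."""
    w, b = np.zeros(F.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # sigmoid probabilities
        g = p - y                                # gradient of the logistic loss
        w -= lr * F.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

X_tr, y_tr = make_traces(200)
X_te, y_te = make_traces(200)
w, b = train_readout(features(X_tr), y_tr)
acc = (((features(X_te) @ w + b) > 0) == y_te).mean()
print(f"test accuracy: {acc:.2f}")
```

Because the two synthetic classes differ in their late-time magnetization, the final hidden state separates them cleanly; the paper's actual task, distinguishing phases of disordered spin chains near the many-body localization transition, is of course far harder and motivates a fully trained recurrent architecture.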

Cited by 70 publications (56 citation statements). References 61 publications.
“…Neural networks have become a standard tool to tackle problems where we want to make predictions without following a particular algorithm or imposing structure on the available data (see for example [32][33][34]) and they have been applied to a wide variety of problems in physics. For example, in condensed matter physics and generally in many-body settings, neural networks have proven particularly useful to characterize phase transitions [4][5][6][7][8][9]. The aim of these works is to optimise the accuracy of the predictions, but not to extract information on what the network learned during training.…”
Section: Comparison With Previous Work
Mentioning confidence: 99%
“…While neural networks have been applied to a variety of problems in physics, most work to date has focused on the efficiency or quality of a neural network's predictions, without an understanding of how it solves the problem [4][5][6][7][8][9] (see Section 4.1 and [10] for a review and further references). Other algorithms and neural network architectures have been developed to produce a physical model by imposing some structure on the space of solutions and on the input data [11][12][13][14][15][16][17][18].…”
Section: Introduction
Mentioning confidence: 99%
“…To further corroborate our analysis, we apply methods from machine learning [57], which has emerged recently as a powerful tool to analyze localization phenomena [58][59][60][61][62][63], to our data obtained using the TDVP. We use two algorithms: a partially supervised approach that has previously been employed in Ref.…”
Section: Machine Learning
Mentioning confidence: 99%
“…It provides an example of a generic quantum many-body system that cannot reach thermal equilibrium [4][5][6][7]. In recent years, an enormous theoretical effort has been invested in understanding the nature of the MBL transition [8][9][10], the dynamical [11][12][13] and entanglement [14][15][16][17] properties of these systems, and their response to external probes [18,19] and periodic driving [20][21][22]. The experimental community [23][24][25][26][27] has also taken interest in this field, in particular because these systems have the potential to store information about initial states for long times and hence may implement quantum memory devices.…”
Section: Introduction
Mentioning confidence: 99%