2017
DOI: 10.1103/physreve.96.022140

Unsupervised learning of phase transitions: From principal component analysis to variational autoencoders

Abstract: We employ unsupervised machine learning techniques to learn latent parameters which best describe states of the two-dimensional Ising model and the three-dimensional XY model. These methods range from principal component analysis to artificial neural network based variational autoencoders. The states are sampled using a Monte Carlo simulation above and below the critical temperature. We find that the predicted latent parameters correspond to the known order parameters. The latent representation of the states o…
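The core idea of the abstract — that the leading principal component of spin configurations tracks the known order parameter (the magnetization) — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy: the "configurations" below are independent biased coin flips standing in for real Monte Carlo samples of the 2D Ising model, and the PCA is done directly via SVD rather than the paper's exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for Monte Carlo samples of an L x L Ising model
# (flattened to L*L spins): below T_c configurations are strongly
# magnetized toward +1 or -1; above T_c spins are nearly random.
def sample_configs(n, L, p_up):
    return np.where(rng.random((n, L * L)) < p_up, 1, -1)

low_T = np.concatenate([sample_configs(50, 8, 0.9),   # magnetized up
                        sample_configs(50, 8, 0.1)])  # magnetized down
high_T = sample_configs(100, 8, 0.5)                  # disordered
X = np.concatenate([low_T, high_T]).astype(float)     # (200, 64)

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
latent = Xc @ Vt[0]          # projection onto the first principal component

# The first PC is (up to sign) close to the uniform mode, so the learned
# latent parameter tracks the magnetization m = mean spin per configuration.
m = X.mean(axis=1)
corr = abs(np.corrcoef(latent, m)[0, 1])
print(f"|corr(PC1, magnetization)| = {corr:.3f}")
```

With data like this the correlation comes out close to 1, which is the PCA half of the paper's claim; the VAE version replaces the linear projection with an encoder network but targets the same latent structure.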


Cited by 442 publications (358 citation statements)
References 36 publications
“…The research literature involving latent spaces often employs visualizations for qualitative evaluation. Most of these visualizations are 2D scatter plots, using two intrinsic latent dimensions directly [Wet17, MNG17, YHSBK17], axes from dimensionality reduction techniques such as t‐SNE ( e.g ., [TWR∗17, JYY∗16, HSSQ17, MKS∗15, YSD∗18]) or PCA ( e.g ., [SRM∗16, MSSW16,UFDR16,FSBL17,GBWD∗18]), or axes of custom projections [BCZ∗16a, BCZ∗16b]. Alternatively, some papers show a 2D grid of reconstructed examples [DTD∗18, JBJ18, ZSE17, KPHL17].…”
Section: Related Work
confidence: 99%
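The 2D scatter plots this excerpt describes usually come from one standard recipe: project high-dimensional latent vectors onto two axes and scatter-plot the result. A minimal sketch with hypothetical data, using PCA via SVD (one of the projection choices the excerpt lists; t-SNE would replace the projection step):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical latent codes (e.g., from a VAE encoder) in 16 dimensions,
# drawn from two clusters so the 2D projection has visible structure.
Z = np.concatenate([rng.normal(-2.0, 1.0, (100, 16)),
                    rng.normal(+2.0, 1.0, (100, 16))])

# PCA down to two axes: the standard basis for a 2D latent scatter plot.
Zc = Z - Z.mean(axis=0)
_, _, Vt = np.linalg.svd(Zc, full_matrices=False)
coords = Zc @ Vt[:2].T       # (n_samples, 2) points, ready to scatter-plot

print(coords.shape)
```

Feeding `coords[:, 0]` and `coords[:, 1]` to any plotting library gives the kind of qualitative latent-space view the surveyed papers rely on.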
“…Researchers have also developed interactive visual analysis tools for latent spaces [STN*16, JSL*17, HG18, LBT*18, LNH*18]. Some tools focus on a subset of tasks [STN*16, LBT*18] in word embeddings, which we extend and bring to a broader range of latent spaces.…”
Section: Visualizing Latent Spaces
confidence: 99%
“…The former is important in atomic and molecular physics, as well as quantum chemistry. The latter is related to machine learning phases of matter, which has already become a topic of intensive interest [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20]. Understanding how a neural network works in this simple problem can shed light on more sophisticated problems.…”
Section: Consider the Schrödinger Equation In The Relative Frame
confidence: 99%
“…Machine learning is emerging as a novel tool for identifying phases of matter [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15]. At its core, this problem can be cast as a classification problem in which data obtained from physical systems are assigned a class (i.e., a phase) using machine learning methods.…”
confidence: 99%
“…At its core, this problem can be cast as a classification problem in which data obtained from physical systems are assigned a class (i.e., a phase) using machine learning methods. This approach has enabled the autonomous detection of order parameters [2,5,6], phase transitions [1,3], and entire phase diagrams [4,7,16,17]. Simultaneous research effort at the interface between machine learning and many-body physics has focused on the use of neural networks for efficient representations of quantum wave functions [18][19][20][21][22][23][24][25][26], drawing a parallel between deep networks and the renormalization group [27][28][29].…”
confidence: 99%