Neural networks can be used to identify phases and phase transitions in condensed matter systems via supervised machine learning. Readily programmable through modern software libraries, a standard feed-forward neural network can be trained to detect multiple types of order parameter directly from raw state configurations sampled with Monte Carlo. In addition, neural networks can detect highly nontrivial states such as Coulomb phases and, if modified into a convolutional neural network, topological phases with no conventional order parameter. We show that this classification occurs within the neural network without knowledge of the Hamiltonian or even the general locality of interactions. These results demonstrate the power of machine learning as a basic research tool in condensed matter and statistical physics.

arXiv:1605.01735v1 [cond-mat.str-el] 5 May 2016

Condensed matter physics is the study of the collective behavior of massively complex assemblies of electrons, nuclei, magnetic moments, atoms, or qubits [1]. This complexity is reflected in the size of the classical or quantum state space, which grows exponentially with the number of particles. This exponential growth is reminiscent of the "curse of dimensionality" commonly encountered in machine learning: a target function to be learned requires an amount of training data that grows exponentially in the dimension (e.g., the number of image features). Despite this curse, the machine learning community has developed a number of techniques with remarkable abilities to recognize, classify, and characterize complex sets of data. In light of this success, it is natural to ask whether such techniques could be applied to condensed-matter physics, particularly in cases where the microscopic Hamiltonian contains strong interactions and numerical simulations are typically employed in the study of phases and phase transitions [2,3].
We demonstrate that modern machine learning architectures, such as fully connected and convolutional neural networks [4], can provide a complementary approach to identifying phases and phase transitions in a variety of systems in condensed matter physics. Training neural networks on data sets obtained by Monte Carlo sampling provides a particularly powerful and simple framework for the supervised learning of phases and phase boundaries in physical models, and can easily be built from readily available tools such as the Theano [5] or TensorFlow [6] libraries.

Conventionally, the study of phases in condensed matter systems is performed with the help of tools that have been carefully designed to elucidate the underlying physical structures of various states. Among the most powerful are Monte Carlo simulations, which consist of two steps: a stochastic importance sampling over state space, and the evaluation of estimators for physical quantities calculated from these samples [3]. These estimators are constructed based on a variety of physical motivations, e.g. the ready availability of an analogous experi...
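The supervised workflow described above (sample configurations, label them by phase, train a feed-forward network on the raw configurations) can be sketched in a few lines of NumPy. The sampler below is a deliberately crude stand-in: it flips spins independently on an ordered state rather than running a true Metropolis simulation, and it keeps only one of the two symmetry-broken ordered states. The network size, learning rate, and step count are illustrative choices, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8                      # linear lattice size; N = L*L spins per configuration

def sample_configs(n, p_flip):
    """Crude stand-in for Monte Carlo sampling: start from the all-up
    ordered state and flip each spin independently with probability
    p_flip.  Small p_flip mimics the low-temperature (ordered) phase;
    p_flip = 0.5 gives fully random (disordered) configurations.  A
    real study would sample the Ising Hamiltonian with Metropolis
    updates and include both symmetry-broken ordered states."""
    base = np.ones((n, L * L))
    flips = rng.random((n, L * L)) < p_flip
    return np.where(flips, -base, base)

# labeled training data: 0 = ordered, 1 = disordered
X = np.vstack([sample_configs(500, 0.05), sample_configs(500, 0.5)])
y = np.concatenate([np.zeros(500), np.ones(500)])

# one hidden layer of tanh units, sigmoid output, full-batch gradient descent
H = 16
W1 = rng.normal(0.0, 0.1, (L * L, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1));     b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

lr = 1.0
for step in range(300):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # P(disordered | configuration)
    d_out = (p - y)[:, None] / len(y)        # binary cross-entropy gradient
    dW2, db2 = h.T @ d_out, d_out.sum(0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
    dW1, db1 = X.T @ d_h, d_h.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = float(((p > 0.5) == (y == 1)).mean())  # training accuracy
```

In the setting of the abstract, the trained network would then be evaluated on configurations sampled across a temperature sweep, with the crossover of its output probability marking the estimated phase boundary.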
We develop a quantum Monte Carlo procedure, in the valence bond basis, to measure the Renyi entanglement entropy of a many-body ground state as the expectation value of a unitary Swap operator acting on two copies of the system. An improved estimator involving the ratio of Swap operators for different subregions enables convergence of the entropy in a simulation time polynomial in the system size. We demonstrate convergence of the Renyi entropy to exact results for a Heisenberg chain. Finally, we calculate the scaling of the Renyi entropy in the two-dimensional Heisenberg model and confirm that the Néel ground state obeys the expected area law for systems up to linear size L=32.
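The identity behind the Swap estimator is that the expectation value of the Swap operator on region A, taken in two copies of the state, equals Tr rho_A^2, so the second Renyi entropy is S_2 = -ln <Swap_A>. The toy check below verifies this identity by exact diagonalization of a 4-site Heisenberg chain, far smaller than the QMC simulations in the abstract; the chain length and cut are illustrative choices.

```python
import numpy as np

# spin-1/2 operators
sx = np.array([[0, 1], [1, 0]]) / 2.0
sy = np.array([[0, -1j], [1j, 0]]) / 2.0
sz = np.array([[1, 0], [0, -1]]) / 2.0
I2 = np.eye(2)

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site chain."""
    mats = [I2] * n
    mats[i] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# open Heisenberg chain H = sum_i S_i . S_{i+1} on n = 4 sites
n = 4
H = sum(site_op(s, i, n) @ site_op(s, i + 1, n)
        for i in range(n - 1) for s in (sx, sy, sz))

# ground state by exact diagonalization
vals, vecs = np.linalg.eigh(H)
psi = vecs[:, 0].reshape(4, 4)   # rows: region A (sites 0,1); cols: region B

# Renyi-2 entropy from the reduced density matrix of region A
rho_A = psi @ psi.conj().T
tr_rho2 = np.real(np.trace(rho_A @ rho_A))
S2 = -np.log(tr_rho2)

# the same quantity as a Swap expectation value on two copies:
# <psi (x) psi| Swap_A |psi (x) psi> = Tr rho_A^2
swap_exp = np.real(np.einsum('ab,cd,cb,ad->', psi, psi,
                             psi.conj(), psi.conj()))
```

The QMC procedure of the abstract evaluates this same Swap expectation value stochastically in the valence bond basis, with the ratio trick controlling the variance for large subregions.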
Inspired by the success of Boltzmann machines based on the classical Boltzmann distribution, we propose a new machine-learning approach based on the quantum Boltzmann distribution of a quantum Hamiltonian. Because of the noncommutative nature of quantum mechanics, the training process of the quantum Boltzmann machine (QBM) can become nontrivial. We circumvent the problem by introducing bounds on the quantum probabilities, which allow us to train the QBM efficiently by sampling. We show examples of QBM training with and without the bound, using exact diagonalization, and compare the results with classical Boltzmann training. We also discuss the possibility of using quantum annealing processors for QBM training and application.
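A quantum Boltzmann distribution assigns measurement probabilities through the Gibbs state rho = e^{-H} / Tr e^{-H} of a quantum Hamiltonian. The snippet below constructs this state by exact diagonalization for a two-qubit transverse-field Ising model; it is an illustrative toy showing the distribution itself, not the QBM training procedure with bounds described in the abstract, and the couplings are arbitrary.

```python
import numpy as np

sx = np.array([[0., 1.], [1., 0.]])
sz = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)

# two-qubit transverse-field Ising Hamiltonian:
# H = -J sz(x)sz - Gamma (sx(x)I + I(x)sx); the sx terms do not
# commute with the sz term, which is what makes QBM training nontrivial
J, Gamma = 1.0, 0.5
H = -J * np.kron(sz, sz) - Gamma * (np.kron(sx, I2) + np.kron(I2, sx))

# Gibbs state rho = e^{-H} / Tr e^{-H} at inverse temperature beta = 1,
# built from the eigendecomposition of H
vals, vecs = np.linalg.eigh(H)
expH = vecs @ np.diag(np.exp(-vals)) @ vecs.T
rho = expH / np.trace(expH)

# probability of observing each computational basis state |v>:
# P(v) = <v| rho |v>, i.e. the diagonal of rho.  With Gamma = 0 this
# reduces to the classical Boltzmann distribution of the sz sz term.
probs = np.real(np.diag(rho))
```

Training a QBM means adjusting the Hamiltonian parameters so that these diagonal probabilities match a data distribution; the noncommuting terms are why the gradients require the bounds discussed in the abstract.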