2020
DOI: 10.1103/physrevx.10.021020

Neural Canonical Transformation with Symplectic Flows

Abstract: Canonical transformation plays a fundamental role in simplifying and solving classical Hamiltonian systems. We construct flexible and powerful canonical transformations as generative models using symplectic neural networks. The model transforms physical variables towards a latent representation with an independent harmonic oscillator Hamiltonian. Correspondingly, the phase space density of the physical system flows towards a factorized Gaussian distribution in the latent space. Since the canonical transformati…
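The abstract's key structural idea, a learnable transformation of phase-space variables that is exactly symplectic and therefore volume preserving, can be illustrated with a pair of shear updates in the leapfrog style. This is a minimal sketch, not the paper's actual architecture: the names `shear_p`, `shear_q`, `grad_V`, and `grad_T` are hypothetical, and the fixed linear maps stand in for what would be learnable neural networks.

```python
import numpy as np

def shear_p(q, p, grad_V):
    # Momentum shear conditioned only on q. The Jacobian of
    # (q, p) -> (q, p') is unit triangular, so |det J| = 1 exactly.
    return q, p - grad_V(q)

def shear_q(q, p, grad_T):
    # Complementary position shear conditioned only on p;
    # also volume preserving by the same argument.
    return q + grad_T(p), p

# Hypothetical stand-ins for learnable functions; a neural
# canonical transformation would parametrize these with networks.
grad_V = lambda q: 0.5 * q
grad_T = lambda p: 0.3 * p

def flow(q, p):
    # One symplectic block: composing shears keeps |det J| = 1,
    # so the pushforward density needs no Jacobian correction.
    q, p = shear_p(q, p, grad_V)
    q, p = shear_q(q, p, grad_T)
    return q, p

z_q, z_p = flow(np.array([1.0, -0.5]), np.array([0.2, 0.7]))
```

Because each shear has unit Jacobian determinant, stacking many such blocks still maps the physical phase-space density to the latent one without any volume factor, which is what lets the latent distribution be a simple factorized Gaussian.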

Cited by 29 publications (30 citation statements)
References 74 publications
“…This is true, in particular, for the lattice formulation of quantum chromodynamics (QCD) [21][22][23], which enables calculations of nonperturbative phenomena arising from the standard model of particle physics. Recently, there has been progress in the development of flow-based generative models which can be trained to directly produce samples from a given probability distribution; early success has been demonstrated in theories of bosonic matter, spin systems, molecular systems, and for Brownian motion [24][25][26][27][28][29][30][31][32][33][34]. This progress builds on the great success of flow-based approaches for image, text, and structured object generation [35][36][37][38][39][40][41][42], as well as non-flow-based machine learning techniques applied to sampling for physics [43][44][45][46][47][48].…”
mentioning
confidence: 99%
“…Parametrizing and optimizing a rich family of equivariant many-body unitary transformations U turn out to be a fairly nontrivial task. In this paper, we present an elegant solution to this problem by constructing U as unitary representation of the canonical transformation of phase space variables in classical mechanics, extending the previous work on neural canonical transformations [14] from classical to quantum domain. The resulting approach naturally generalizes the ground-state variational Monte Carlo (VMC) method [15,16] to finite temperatures and is not hindered by the fermion sign problem.…”
Section: Introduction
mentioning
confidence: 99%
“…However, they still demand advances in quantum technologies to be practically useful. Third, variational free energy studies of statistical mechanics and field theory problems [14,[35][36][37][38][39][40] can be regarded as the classical counterparts of the present approach. Last but not least, the so-called quantum flow approach [41] performs similar unitary transformation to a single-particle basis.…”
Section: Introduction
mentioning
confidence: 99%
“…Normalizing flows, on the other hand, are a family of non-linear invertible deep learning models (Dinh et al., 2015; Kingma & Dhariwal, 2018; Chen et al., 2018; Li et al., 2020; Li & Wang, 2018). Their applications cover many topics, including image generation (Kingma & Dhariwal, 2018; Dinh et al., 2016), independent component analysis (ICA) (Dinh et al., 2015; Sorrenson et al., 2020), variational inference (Kingma et al., 2016b; Rezende & Mohamed, 2015), Monte Carlo sampling (Song et al., 2017), and scientific applications (Li et al., 2020; Li & Wang, 2018; Hu et al., 2020a; Noé et al., 2019). Normalizing flows are a promising candidate for learnable wavelet transformation.…”
Section: Introduction
mentioning
confidence: 99%