2020
DOI: 10.1103/PhysRevLett.125.121601

Equivariant Flow-Based Sampling for Lattice Gauge Theory

Abstract: We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge invariant by construction. We demonstrate the application of this framework to U(1) gauge theory in two spacetime dimensions, and find that, at small bare coupling, the approach is orders of magnitude more efficient at sampling topological quantities than more traditional sampling procedures such as hybrid Monte Carlo and heat bath.
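
The sampler described in the abstract pairs a trained flow with an exact accept/reject step. Below is a minimal sketch of that flow-based MCMC loop (an independence Metropolis test); the `flow.sample()` and `action(phi)` interfaces are hypothetical stand-ins for illustration, not the authors' code.

```python
import numpy as np

# Minimal sketch of flow-based MCMC: the flow proposes independent field
# configurations, and a Metropolis test with acceptance probability
# min(1, p(x') q(x) / (p(x) q(x'))) makes the chain exact for the target
# p(phi) ~ exp(-action(phi)). `flow` and `action` are assumed interfaces.

def flow_mcmc(flow, action, n_steps, seed=0):
    rng = np.random.default_rng(seed)
    phi, logq = flow.sample()          # proposal and its model log-density
    logp = -action(phi)                # unnormalized target log-density
    chain = []
    for _ in range(n_steps):
        phi_new, logq_new = flow.sample()
        logp_new = -action(phi_new)
        # log of the independence-Metropolis acceptance ratio
        if np.log(rng.uniform()) < (logp_new - logq_new) - (logp - logq):
            phi, logq, logp = phi_new, logq_new, logp_new
        chain.append(phi)
    return chain
```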

Cited by 180 publications (212 citation statements)
References 54 publications
“…As we have shown, it successfully improves the problem of topology freezing and exponentially growing autocorrelation times in the 2D model considered, both with and without fermion content. The integrated autocorrelation time of wHMC in the pure gauge case is very similar to the one obtained with machine-learned flow-based sampling algorithms [14,15], but without the additional training cost.…”
Section: Discussion (supporting)
confidence: 63%
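
The quantity compared above can be made concrete. Here is a minimal estimator of the integrated autocorrelation time, tau_int = 1/2 + sum over t >= 1 of rho(t), with a crude truncation at the first sign change of rho (production analyses use automatic windowing instead); the AR(1) test chain is purely illustrative.

```python
import numpy as np

def integrated_autocorrelation_time(x):
    """Estimate tau_int = 1/2 + sum_t rho(t), truncated at rho's first zero."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    f = np.fft.rfft(x, 2 * n)                 # zero-pad to avoid wrap-around
    acf = np.fft.irfft(f * np.conj(f))[:n]    # unnormalized autocorrelation
    acf /= acf[0]                             # normalize so rho(0) = 1
    tau = 0.5
    for rho in acf[1:]:
        if rho <= 0.0:                        # crude truncation window
            break
        tau += rho
    return tau

# Sanity check: an AR(1) chain has exact tau_int = (1 + a) / (2 * (1 - a))
rng, a, x = np.random.default_rng(1), 0.9, np.zeros(100_000)
for t in range(1, len(x)):
    x[t] = a * x[t - 1] + rng.normal()
print(integrated_autocorrelation_time(x))    # roughly 9.5
```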
“…We will test our algorithm in the U(1) gauge theory in 2D, with and without fermion content. This model has recently been used as a benchmark for machine-learned flow-based sampling algorithms [14,15], as well as in tensor network approaches [16,17] (see Ref. [18] for a review).…”
Section: Introduction (mentioning)
confidence: 99%
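
The topological quantity that freezes in this 2D U(1) benchmark has a simple lattice definition: Q = (1/2pi) * sum over sites of the plaquette phase wrapped to [-pi, pi). A sketch follows, assuming link angles stored as theta[mu, x, y]; that layout is a choice made here for illustration.

```python
import numpy as np

# Geometric topological charge in 2D U(1) lattice gauge theory:
# Q = (1/2pi) * sum_x arg P(x), with P(x) the plaquette phase wrapped to
# [-pi, pi). On a periodic lattice, Q is an integer on any configuration.

def topological_charge(theta):
    t0, t1 = theta[0], theta[1]   # link angles in the x- and y-directions
    # plaquette angle: theta_0(x) + theta_1(x+e0) - theta_0(x+e1) - theta_1(x)
    plaq = t0 + np.roll(t1, -1, axis=0) - np.roll(t0, -1, axis=1) - t1
    wrapped = (plaq + np.pi) % (2 * np.pi) - np.pi    # wrap to [-pi, pi)
    return wrapped.sum() / (2 * np.pi)

theta = np.random.default_rng(0).uniform(-np.pi, np.pi, size=(2, 8, 8))
print(topological_charge(theta))  # integer, up to floating-point rounding
```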
“…Normalizing flows allow the latent space to be irregular and not necessarily Gaussian. While the exact deep learning architecture is not critical, we focus on normalizing flows, since they have been shown to be a powerful tool for representing physical models [11,35–45]. In the QUAK construction, our signal choices can be thought of as "approximate-priors", since they help direct the space of searches towards signals with similar features.…”
Section: Concept (mentioning)
confidence: 99%
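
The "powerful tool" claim above rests on the flow's exact change of variables, log q(x) = log p_z(z) + log|det dz/dx|. A deliberately tiny sketch with a single elementwise affine map; the parameters s and b are illustrative, not a trained model.

```python
import numpy as np

# Change of variables for a toy normalizing flow with forward map
# x = exp(s) * z + b (elementwise), so log q(x) = log p_z(z) - sum(s).

rng = np.random.default_rng(0)
dim = 4
s, b = rng.normal(size=dim), rng.normal(size=dim)

def sample(n):
    z = rng.normal(size=(n, dim))       # Gaussian latent variable
    x = np.exp(s) * z + b               # invertible forward map
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi)).sum(axis=1)
    return x, log_pz - s.sum()          # exact model log-density log q(x)

x, logq = sample(3)
print(x.shape, logq)
```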
“…In fact, it is somewhat surprising that autoencoders appear to perform worse on the nominally easier task of a bump hunt in leptons than on the superficially much more complicated task of jet image recognition and classification, since leptons live on a phase space of fixed dimension. The increasing prominence of "physics-inspired neural networks", where networks with important symmetry principles (such as gauge equivariance and Lorentz symmetry) hard-coded into the architecture perform better than networks forced to learn these principles from scratch [54–56], suggests that knowledge of the topology may in fact be necessary to appropriately interpret the autoencoder performance. We illustrate this point with the low-dimensional examples described above, and speculate on how these principles might be applied in the context of phase space.…”
Section: JHEP04(2021)280 (mentioning)
confidence: 99%
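
For context on the symmetry claim in the statement above: equivariance means f(g·x) = g·f(x) for every group element g, a property that can be checked numerically. The toy rotation-equivariant map below is purely illustrative and unrelated to the cited architectures.

```python
import numpy as np

# Equivariance check: f(g . x) == g . f(x) for all group elements g.
# Rescaling a 2D point by a function of its rotation-invariant radius
# commutes with rotations, so f is rotation-equivariant by construction.

def f(x):
    return x * np.tanh(np.linalg.norm(x))   # radial rescaling

def rotate(x, phi):
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s], [s, c]]) @ x

x, phi = np.array([0.3, -1.2]), 0.7
print(np.allclose(f(rotate(x, phi)), rotate(f(x), phi)))  # True
```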