2022
DOI: 10.1103/PhysRevLett.128.032003
Lattice Gauge Equivariant Convolutional Neural Networks

Abstract: We propose Lattice gauge equivariant Convolutional Neural Networks (L-CNNs) for generic machine learning applications on lattice gauge theoretical problems. At the heart of this network structure is a novel convolutional layer that preserves gauge equivariance while forming arbitrarily shaped Wilson loops in successive bilinear layers. Together with topological information, for example from Polyakov loops, such a network can in principle approximate any gauge covariant function on the lattice. We demonstrate t…
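The gauge equivariance at the core of the abstract can be made concrete: products of link variables around closed loops (Wilson loops) transform covariantly under local gauge transformations, so their traces are gauge invariant. The following NumPy sketch is illustrative only, not the authors' code: the SU(2) gauge group, the small 2D periodic lattice, and all helper names are assumptions. It verifies the invariance numerically for the elementary 1x1 Wilson loop, the plaquette.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_su2():
    # Random SU(2) matrix from a unit quaternion: a0*I + i*(a1,a2,a3).sigma
    a = rng.normal(size=4)
    a /= np.linalg.norm(a)
    return np.array([[a[0] + 1j * a[3],  a[2] + 1j * a[1]],
                     [-a[2] + 1j * a[1], a[0] - 1j * a[3]]])

L = 4  # lattice extent (illustrative)
# Link variables U[mu, x, y] on a 2D periodic lattice, mu in {0, 1}
U = np.array([[[random_su2() for _ in range(L)] for _ in range(L)]
              for _ in range(2)])

def plaquette(U, x, y):
    # 1x1 Wilson loop: U_x(n) U_y(n+x) U_x(n+y)^dag U_y(n)^dag
    xp, yp = (x + 1) % L, (y + 1) % L
    return U[0, x, y] @ U[1, xp, y] @ U[0, x, yp].conj().T @ U[1, x, y].conj().T

# Local gauge transformation: U_mu(n) -> Omega(n) U_mu(n) Omega(n+mu)^dag
Omega = np.array([[random_su2() for _ in range(L)] for _ in range(L)])
V = np.empty_like(U)
for x in range(L):
    for y in range(L):
        V[0, x, y] = Omega[x, y] @ U[0, x, y] @ Omega[(x + 1) % L, y].conj().T
        V[1, x, y] = Omega[x, y] @ U[1, x, y] @ Omega[x, (y + 1) % L].conj().T

# The plaquette transforms covariantly, P(n) -> Omega(n) P(n) Omega(n)^dag,
# so its trace is gauge invariant:
t0 = np.trace(plaquette(U, 1, 2))
t1 = np.trace(plaquette(V, 1, 2))
assert np.allclose(t0, t1)
```

A gauge-equivariant layer must preserve exactly this transformation behavior for its outputs, which is what allows the L-CNN's bilinear layers to grow larger loops from smaller ones without breaking the symmetry.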

Cited by 46 publications (38 citation statements)
References 80 publications
“…We apply our method to a heuristic atmospheric weather model using the NG-RC [13] as the core learning machine, which reduces the computational complexity and the data set size required for training while displaying state-of-the-art accuracy. Accounting for the translational symmetry of the weather model further reduces the training time and data, highlighting the importance of addressing symmetries [17][18][19][20][21]. Figure 1 illustrates our scheme.…”
mentioning
confidence: 99%
“…ML methods are being developed to speed up all three steps in such calculations [35]: (i) generation of gauge configurations (see Sec. 6), (ii) measurement of correlation functions [36,37,38,39] (see example below), and (iii) their analysis to extract physics [40,41,42]. The first two steps in LQCD, and ML systems, are computational and are therefore labeled "black boxes".…”
Section: Lattice QCD
mentioning
confidence: 99%
“…In ML too, one can impose symmetries at the level of the model (a penalty for models that break it or build it into the tuning of weights) or data (data representation itself builds in the symmetry or perform data augmentation to realize the symmetry). Ongoing work suggests that incorporating an understanding of the science or the dynamics of the system into ML is likely to dramatically increase its power [36,37,22,48].…”
Section: Contrasting Two Black Boxes: Lattice QCD and ML
mentioning
confidence: 99%
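The two routes named in the excerpt above, imposing a symmetry at the model level via a penalty versus realizing it at the data level via augmentation, can be sketched in a few lines. The sketch below is a toy illustration under assumed names (`model`, `invariant_model`, and `symmetry_penalty` are hypothetical), using cyclic shifts as the symmetry group:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=8)   # one toy training sample
w = rng.normal(size=8)   # toy model weights

# Generic toy model with no built-in symmetry (hypothetical)
def model(x, w):
    return float(np.tanh(x @ w))

# Toy model built only from shift-invariant operations
def invariant_model(x, w):
    return float(np.tanh(x.mean() * w.sum()))

# Data-level route: augmentation realizes the translational symmetry by
# adding every cyclic shift of the sample to the training set
augmented = np.stack([np.roll(x, s) for s in range(len(x))])

# Model-level route: a penalty measuring how much the model's output
# varies under the symmetry group; zero iff exactly shift invariant on x
def symmetry_penalty(f, x, w):
    outs = np.array([f(np.roll(x, s), w) for s in range(len(x))])
    return outs.var()

assert symmetry_penalty(invariant_model, x, w) < 1e-20
assert symmetry_penalty(model, x, w) > 0.0
```

Equivariant architectures such as the L-CNN take the model-level route to its limit: the symmetry holds exactly by construction rather than approximately through a penalty or augmented data.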
“…Observable measurement -Quantities like correlation functions are evaluated over ensembles of field configurations. ML applications thus far include novel methods to extract thermodynamic observables [44], action parameter regression [67], observable approximation [68][69][70], design of new observables [58,[71][72][73][74][75][76][77][78][79][80], and path-integral contour deformations for baryonic correlators [81].…”
Section: Introduction
mentioning
confidence: 99%