2022
DOI: 10.1103/physrevlett.129.136402

Deep Learning the Functional Renormalization Group

Abstract: We perform a data-driven dimensionality reduction of the scale-dependent four-point vertex function characterizing the functional renormalization group (FRG) flow for the widely studied two-dimensional t–t′ Hubbard model on the square lattice. We demonstrate that a deep learning architecture based on a neural ordinary differential equation solver in a low-dimensional latent space efficiently learns the FRG dynamics that delineates the various magnetic and d-wave superconducting regimes of the Hubbard model. W…
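The latent-space neural-ODE idea sketched in the abstract — compress the high-dimensional vertex function, integrate a learned ODE in the latent space over the RG scale, then decode — can be illustrated roughly as follows. Everything here is a hypothetical stand-in: the dimensions, the linear "encoder"/"decoder", and the toy right-hand side are illustrative, not the trained networks or the FRG equations from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: the four-point vertex flattened to D components,
# compressed to a d-dimensional latent space.
D, d = 64, 3

# Random linear maps standing in for trained encoder/decoder networks.
W_enc = rng.normal(size=(d, D)) / np.sqrt(D)
W_dec = rng.normal(size=(D, d)) / np.sqrt(d)

def latent_rhs(z):
    """Toy right-hand side f(z) of the latent ODE dz/dl = f(z);
    in the paper this role is played by a learned neural network."""
    return -z + 0.1 * np.tanh(z)

def rk4_step(z, h):
    """One classical Runge-Kutta step for the latent flow."""
    k1 = latent_rhs(z)
    k2 = latent_rhs(z + 0.5 * h * k1)
    k3 = latent_rhs(z + 0.5 * h * k2)
    k4 = latent_rhs(z + h * k3)
    return z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Encode an initial vertex, integrate in latent space, decode the result.
gamma0 = rng.normal(size=D)      # initial four-point vertex (flattened)
z = W_enc @ gamma0               # latent initial condition
for _ in range(100):             # flow over the RG scale in 100 steps
    z = rk4_step(z, h=0.05)
gamma_final = W_dec @ z          # decoded end-of-flow vertex
print(gamma_final.shape)         # (64,)
```

The design point is that the expensive object (the D-component vertex) is only touched at the endpoints; the flow itself runs in the cheap d-dimensional latent space.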

Cited by 18 publications (13 citation statements). References 69 publications.
“…However, to better match the teacher model p_tch(x), the ⟨x₀x₂⟩ and ⟨x₁x₃⟩ correlations should be further weakened relative to the others, because there is no direct coupling between these diagonal spins in the original teacher model, equation (11). This is where the E representation plays a role: the z₀² and z₀³ fluctuations mediate some negative correlation in ⟨x₀x₂⟩ and ⟨x₁x₃⟩ due to the minus signs in equation (12), which helps bring the student model's visible distribution p_std(x) closer to that of the teacher model p_tch(x). This argument explains why assigning more point group representations to the hidden spin in the student model generally helps to improve its expressive power.…”
Section: Choosing Point Group Representations (mentioning)
confidence: 99%
“…Prior research has demonstrated that neural networks can learn to perform hierarchical feature extraction at the configuration level [1-11]. However, a more fascinating aspect of RG is its ability to quantitatively analyze the flow of a physical theory in parameter space at the model level [12,13]. Therefore, this research aims to develop a novel machine-learning RG (MLRG) method that can automatically formulate RG flow equations, discover RG monotones, propose effective theories, identify critical points, and estimate critical exponents, all starting from the symmetry and dimension of the physical system.…”
Section: Introduction (mentioning)
confidence: 99%
“…D^(1)_i(10, 6), on each of which the distillation process is performed and 10 distilled images are obtained; we then obtain the distilled dataset D^(2)(10, 100), which contains 100 images for each class at the second level. Repeating this divide-and-conquer strategy, we obtain datasets D^(3)(10, 20) and D^(4)(10, 4), and finally reach the highest distilled dataset D^(5)(10, 1), which contains only a single distilled image for each class and can be regarded as the typical representative hosting the essential features of that class. The whole process is illustrated in Fig.…”
Section: Taylornet (mentioning)
confidence: 99%
“…In the past decade, machine learning has drawn great attention from almost all natural science and engineering communities, such as mathematics [1-3], physics [4-10], biology [11-13], and materials science [14-16], and has been widely used in many aspects of modern society, e.g., automatic driving systems, face recognition, fraud detection, expert recommendation systems, speech enhancement, and natural language processing. In particular, deep learning techniques based on artificial neural networks [17,18] have progressively become the most popular and dominant machine learning approaches, and their interactions with many-body physics have been intensively explored in recent years.…”
Section: Introduction (mentioning)
confidence: 99%
“…[3] As a data-driven method, deep learning can learn from data and solve problems using previously mastered knowledge. Recently, more and more deep learning methods have been introduced into physics research, [4] which provides a new perspective for researchers. In the photonic-device design area, applications such as on-demand design of chiral metamaterials, [5] plasmonic nanostructure design, [6] and inverse design of Fabry-Perot-cavity-based color filters have been proposed.…”
Section: Introduction (mentioning)
confidence: 99%