2019
DOI: 10.1103/physrevd.100.011501

Regressive and generative neural networks for scalar field theory

Abstract: We explore the perspectives of machine learning techniques in the context of quantum field theories. In particular, we discuss two-dimensional complex scalar field theory at nonzero temperature and chemical potential, a theory with a nontrivial phase diagram. A neural network is successfully trained to recognize the different phases of this system and to predict the value of various observables, based on the field configurations. We analyze a broad range of chemical potentials and find that the network is robu…
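The classification task described in the abstract, recognizing phases from field configurations, can be illustrated with a minimal sketch. The lattice size, the two phase distributions (site values of |φ| drawn around different means), and the plain logistic classifier below are all assumptions for illustration, not the paper's actual architecture or data:

```python
import math
import random

random.seed(0)

L = 8  # assumed small lattice, 8x8 sites

def make_config(phase):
    # Hypothetical stand-in for a field configuration: |phi| per site,
    # fluctuating around 1 in one phase and around 0 in the other.
    mu = 1.0 if phase == 1 else 0.0
    return [abs(random.gauss(mu, 0.3)) for _ in range(L * L)]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

# Logistic classifier on the flattened configuration, trained by
# stochastic gradient descent on the cross-entropy loss.
w = [0.0] * (L * L)
b = 0.0
lr = 0.05

train = [(make_config(p), p) for p in [0, 1] * 200]
random.shuffle(train)

for epoch in range(20):
    for x, y in train:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        g = p - y  # gradient of cross-entropy w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

# Evaluate on held-out configurations.
test = [(make_config(p), p) for p in [0, 1] * 50]
acc = sum(
    (sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5) == (y == 1)
    for x, y in test
) / len(test)
print(f"test accuracy: {acc:.2f}")
```

The two synthetic phases are linearly separable by construction, so even this single-layer sketch classifies them reliably; the paper's networks operate on genuine Monte Carlo configurations where the decision boundary is far less trivial.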

Cited by 107 publications (79 citation statements)
References 44 publications
“…These have inspired physicists to adopt the technique to tackle physical problems of great complexity. Much progress has been made in nuclear physics [39][40][41][42][43][44][45], lattice field theory [46][47][48][49][50], particle physics [51][52][53][54][55], astrophysics [56][57][58], and condensed matter physics [59][60][61][62][63][64][65].…”
Section: Introduction (mentioning; confidence: 99%)
“…Recently, there has been progress in the development of flow-based generative models which can be trained to directly produce samples from a given probability distribution; early success has been demonstrated in theories of bosonic matter, spin systems, molecular systems, and for Brownian motion [24][25][26][27][28][29][30][31][32][33][34]. This progress builds on the great success of flow-based approaches for image, text, and structured object generation [35][36][37][38][39][40][41][42], as well as non-flow-based machine learning techniques applied to sampling for physics [43][44][45][46][47][48]. If flow-based algorithms can be designed and implemented at the scale of state-of-the-art calculations, they would enable efficient sampling in lattice theories that are currently hindered by CSD.…”
(mentioning; confidence: 99%)
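The change-of-variables identity underlying the flow-based samplers mentioned above can be shown in a few lines: a flow maps easy-to-draw base samples to target samples, and its exact model density follows from subtracting the log-Jacobian of the map. The single affine map and the parameters `mu` and `s` below are arbitrary assumptions for illustration; a real flow composes many learned invertible layers:

```python
import math
import random

random.seed(1)

# Hypothetical one-parameter affine flow: x = mu + s * z, with z ~ N(0, 1).
mu, s = 0.7, 1.8

def log_normal(x, mean=0.0, std=1.0):
    # Log-density of a 1D Gaussian.
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def flow_sample():
    z = random.gauss(0.0, 1.0)          # draw from the base distribution
    x = mu + s * z                       # push through the flow
    # Change of variables: log q(x) = log p(z) - log |dx/dz|
    logq = log_normal(z) - math.log(abs(s))
    return x, logq

x, logq = flow_sample()
# For this affine map, the flow's density matches N(mu, s^2) exactly.
assert abs(logq - log_normal(x, mu, s)) < 1e-9
```

Having both a sample and its exact log-density is what lets such models be trained against a known lattice action and corrected to exactness, e.g. by reweighting or a Metropolis accept/reject step.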
“…In other words, the neural network learns how to predict a required feature of a complex system and then uses the acquired knowledge to make the predictions independently. Given the impressive versatility of the approach, ML methods find their implementations in studies of the phase structure of various many-body systems, strongly correlated environments, and field theories [17][18][19][20][21][22][23][24][25][26][27][28][29].…”
Section: Introduction (mentioning; confidence: 99%)