2023
DOI: 10.1103/PhysRevB.107.195115
Optimizing design choices for neural quantum states

Cited by 18 publications (7 citation statements)
References 54 publications
“…Unfortunately, the performance depends on the symmetrization technique (see, e.g. [21]); therefore, careful consideration of the symmetrization is necessary to obtain a good performance 6 . • Combination.…”
Section: 3.2.1
confidence: 99%
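The symmetrization issue raised in this statement can be made concrete with a minimal sketch (a hypothetical toy amplitude, not the cited paper's architecture): one common technique averages the network output over all lattice translations of the input configuration, which makes every symmetrized evaluation cost N unsymmetrized ones.

```python
import numpy as np

def amplitude(params, config):
    # Toy unsymmetrized "network": a simple parametrized amplitude.
    # (Hypothetical stand-in for a neural quantum state.)
    return np.exp(np.dot(params, config))

def symmetrized_amplitude(params, config):
    # Project onto the translation-invariant sector by averaging the
    # amplitude over all N lattice translations of the configuration.
    n = len(config)
    return np.mean([amplitude(params, np.roll(config, k)) for k in range(n)])

params = np.array([0.3, -0.1, 0.2, 0.05])
config = np.array([1, -1, 1, -1])

# The symmetrized amplitude is invariant under translating the input,
# at the price of N amplitude evaluations per configuration.
a1 = symmetrized_amplitude(params, config)
a2 = symmetrized_amplitude(params, np.roll(config, 1))
assert np.isclose(a1, a2)
```

This averaging is only one of several symmetrization techniques; as the statement notes, which one is chosen can change the attainable accuracy.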
“…Since then, extensions and improvements of the artificial neural network method have been continuously pursued all over the world. Currently, its applications have been extended to simulations of quantum spin systems with geometrical frustration [9][10][11][12][13][14][15][16][17][18][19][20][21][22], itinerant boson systems [23,24], fermion systems [9,[25][26][27][28][29][30][31][32][33][34][35], fermion-boson coupled systems [36], topologically nontrivial quantum states [5,6,[37][38][39][40][41], excited states [15,32,36,[42][43][44], real-time evolution [7,45,46], open quantum systems [47][48]…”
Section: Introduction
confidence: 99%
“…years, machine learning methods [29,30] have provided a complementary strategy to rationalize phases of matter, often in combination with conventional quantum many-body methods. The roles demonstrated by machine learning methods in tackling many-body problems include characterizing different phases of matter [31][32][33][34][35][36][37][38][39][40], deep learning of quantum dynamics [41][42][43][44], obtaining many-body wave functions [45][46][47][48][49], and optimizing the performance of computational simulations [50].…”
Section: Introduction
confidence: 99%
“…To consider an architecture scalable, both the number of its parameters and the network evaluation cost should at most grow mildly with the system size 𝑁 . [32] RBMs and their variants violate the first requirement because, for a given constant ratio 𝛼 between the number of hidden and visible neurons, the number of parameters grows quadratically. [30] RNNs violate the second requirement with their quadratically scaling computational cost, as the cost of evaluating an unsymmetrized configuration grows linearly with the system size 𝑁 , and the symmetrization gives another factor 𝑁 .…”
confidence: 99%
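The quadratic parameter growth claimed for RBMs is easy to verify by counting: with N visible units and αN hidden units, the dense weight matrix alone contributes αN² parameters. A short sketch (generic dense-RBM bookkeeping, with hypothetical function names):

```python
def rbm_param_count(n_visible, alpha=2):
    # RBM with N visible units and alpha*N hidden units:
    # dense coupling matrix plus visible and hidden biases.
    n_hidden = alpha * n_visible
    weights = n_visible * n_hidden   # alpha * N^2 couplings
    biases = n_visible + n_hidden    # O(N) bias terms
    return weights + biases

counts = {n: rbm_param_count(n) for n in (10, 20, 40)}
print(counts)
# Doubling N roughly quadruples the count, i.e. O(N^2) growth.
```

The O(N) bias terms quickly become negligible next to the αN² couplings, which is why a fixed hidden-unit ratio α forces quadratic scaling.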
“…[30] RNNs violate the second requirement with their quadratically scaling computational cost, as the cost of evaluating an unsymmetrized configuration grows linearly with the system size 𝑁 , and the symmetrization gives another factor 𝑁 . [32] Meanwhile, the autoregressive property imposes specific constraints on the architecture of RNNs, thereby restricting their expressiveness. [33] Recently proposed transformer-based wavefunctions also come at a significant computational cost due to their quadratic complexity with respect to the input sequence length.…”
confidence: 99%
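The quadratic complexity attributed to transformer-based wavefunctions comes from self-attention: the attention matrix has L × L entries for an input sequence of length L. A rough cost model (a hypothetical FLOP estimate, counting only the two L × L matrix products per head):

```python
def attention_flops(seq_len, d_model):
    # QK^T scores: seq_len * seq_len * d_model multiplies,
    # attention-weighted value sum: another seq_len * seq_len * d_model.
    return 2 * seq_len * seq_len * d_model

print(attention_flops(64, 32), attention_flops(128, 32))
# Doubling the sequence length quadruples the attention cost.
```

For a spin chain, the sequence length is the number of sites N, so this is the quadratic-in-N evaluation cost the statement refers to.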