2023
DOI: 10.1103/physrevb.107.075147
Transformer quantum state: A multipurpose model for quantum many-body problems

Cited by 14 publications (2 citation statements)
References 33 publications
“…In the past years, extension of transformers to image processing (vision transformers) has proven itself to be comparably powerful to convolutions [73], opening possible routes toward their application in many-body physics [74][75][76]. The architecture of a vision transformer is schematically illustrated in Fig.…”
Section: Transformer Encoder
confidence: 99%
“…Our approach is inspired by the success of foundation models in language 22 or vision 23 , 24 , where models are extensively pre-trained and then applied to new tasks—either without any subsequent training (referred to as zero-shot evaluation) or after small amount of training on the new task (referred to as fine-tuning). Zhang et al 25 have shown that this paradigm can be successfully applied to wavefunctions, in their case for model Hamiltonians in second quantization.…”
Section: Introduction
confidence: 99%