2017
DOI: 10.1103/PhysRevA.96.062327

Tomography and generative training with quantum Boltzmann machines

Abstract: The promise of quantum neural nets, which utilize quantum effects to model complex data sets, has made their development an aspirational goal for quantum machine learning and quantum computing in general. Here we provide new methods of training quantum Boltzmann machines, which are a class of recurrent quantum neural networks. Our work generalizes existing methods and provides new approaches for training quantum neural networks that compare favorably to existing methods. We further demonstrate that quantum Bolt…
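As a companion to the truncated abstract, here is a minimal sketch of the kind of objective commonly used for generative training of quantum Boltzmann machines. The notation (the parametrized Hamiltonian H(θ), the data state ρ_data, and the Gibbs state σ(θ)) is my own and is not taken from the paper itself.

```latex
% Hedged sketch, not necessarily the paper's exact formulation: generative
% training as minimization of a quantum relative entropy between a target
% (data) state and the Gibbs state of a parametrized Hamiltonian H(\theta).
\sigma(\theta) = \frac{e^{-H(\theta)}}{\operatorname{Tr}\, e^{-H(\theta)}},
\qquad
\mathcal{O}(\theta) = S\bigl(\rho_{\mathrm{data}} \,\|\, \sigma(\theta)\bigr)
= \operatorname{Tr}\bigl[\rho_{\mathrm{data}} \log \rho_{\mathrm{data}}\bigr]
- \operatorname{Tr}\bigl[\rho_{\mathrm{data}} \log \sigma(\theta)\bigr].

% For a linear ansatz H(\theta) = \sum_j \theta_j H_j with no hidden units,
% the gradient reduces to a difference of expectation values:
\partial_{\theta_j} \mathcal{O}(\theta)
= \operatorname{Tr}\bigl[\rho_{\mathrm{data}}\, H_j\bigr]
- \operatorname{Tr}\bigl[\sigma(\theta)\, H_j\bigr].
```

With hidden units the corresponding marginal objective is harder to evaluate, which is the point several of the citation statements below turn on.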

Cited by 161 publications (181 citation statements). References 12 publications.

Citation statements (ordered by relevance):
“…For classical data sets training a model with hidden variables, the objective function of Ref. [43] involves a computationally difficult trace, in contrast to our bound [45]. Thus, the two approaches offer complementary strategies for the unsupervised learning of classical and quantum data distributions using quantum Boltzmann machines.…”
Section: Discussion (mentioning)
confidence: 99%
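To make the contrast in this excerpt concrete, the display below gives my reading (an assumption, in my own notation, not the cited papers') of why hidden variables make the objective hard: the likelihood of a visible configuration v places a partial trace over the hidden register inside a logarithm.

```latex
% Hedged sketch of the structure the excerpt refers to (my notation):
% with visible units v and hidden units h, the model likelihood is
P_\theta(v) = \frac{\operatorname{Tr}\bigl[(|v\rangle\langle v| \otimes I_h)\, e^{-H(\theta)}\bigr]}
                   {\operatorname{Tr}\bigl[e^{-H(\theta)}\bigr]},
\qquad
\mathcal{L}(\theta) = \sum_v p_{\mathrm{data}}(v)\, \log P_\theta(v).
```

When H(θ) contains non-commuting hidden-unit terms, the numerator does not reduce to a classical sum. Bound-based approaches (for example, Golden-Thompson-type inequalities) replace this trace with a tractable clamped partition function at the cost of optimizing a bound rather than the likelihood itself; whether that matches the specific bound of Ref. [45] cannot be verified from this page alone.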
“…Whereas the present paper focuses only on learning classical probability distributions, Ref. [43] provides a strategy to also learn a quantum distribution, including the ability to train the transverse field. In the case of classical data sets, when the model is fully visible, the objective function is exactly the same as the bound introduced in this paper [44].…”
Section: Discussion (mentioning)
confidence: 99%
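The "fully visible" case mentioned in this excerpt is small enough to illustrate end to end. The sketch below is not code from either paper; it assumes the relative-entropy-style objective and linear Hamiltonian ansatz sketched after the abstract, and uses dense NumPy diagonalization on two qubits so every quantity is exact.

```python
# Hedged illustration, not the papers' implementation: exact gradient of a
# relative-entropy-style objective for a tiny, fully visible quantum
# Boltzmann machine, computed by dense diagonalization.  The term set
# (local Z fields, one transverse X field, one ZZ coupling) is an
# arbitrary choice made for this example.
import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def op_on(n, site_ops):
    """Embed single-qubit operators {site: op} into an n-qubit operator."""
    return reduce(np.kron, [site_ops.get(q, I2) for q in range(n)])

n = 2
# Linear ansatz H(theta) = sum_j theta_j * H_j.
terms = [op_on(n, {0: Z}), op_on(n, {1: Z}),
         op_on(n, {0: X}), op_on(n, {0: Z, 1: Z})]

def gibbs_state(theta):
    """Return sigma(theta) = exp(-H(theta)) / Tr exp(-H(theta))."""
    H = sum(t * h for t, h in zip(theta, terms))
    evals, evecs = np.linalg.eigh(H)          # H is Hermitian
    expH = (evecs * np.exp(-evals)) @ evecs.conj().T
    return expH / np.trace(expH)

def gradient(theta, rho_data):
    """Gradient of -Tr[rho_data log sigma(theta)]: <H_j>_data - <H_j>_model."""
    sigma = gibbs_state(theta)
    return np.array([np.trace(rho_data @ h).real - np.trace(sigma @ h).real
                     for h in terms])

# Toy "classical" data: a diagonal mixture over computational basis states.
rho_data = np.diag([0.5, 0.2, 0.2, 0.1])

theta = np.zeros(len(terms))
for _ in range(300):                          # plain gradient descent
    theta -= 0.1 * gradient(theta, rho_data)

print("trained parameters:", np.round(theta, 3))
print("model diagonal:    ", np.round(np.diag(gibbs_state(theta)).real, 3))
```

The two-term gradient (data expectation minus model expectation) is what keeps the fully visible case tractable; hidden units would reintroduce the partial trace discussed in the previous excerpt.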
“…For example, the quantum principal component analysis [2] requires universal fault-tolerant hardware in order to implement the necessary SWAP operations. As another example, the quantum Boltzmann machine [3,4] requires the preparation of highly non-trivial thermal states. Moreover, those approaches provide limited control over the level of approximation.…”
Section: Results (mentioning)
confidence: 99%
“…This process of approximately reconstructing a quantum state is already known to physicists under the name of quantum state tomography. Indeed, there already exist proposals of generative models for tomography such as the quantum principal component analysis [2] and the quantum Boltzmann machine [3,4]. Other machine learning approaches for tomography have been formulated using the different framework of probably approximately correct learning [5,6].…”
Section: Introduction (mentioning)
confidence: 99%
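Since this excerpt frames tomography as approximately reconstructing a quantum state, a minimal non-generative baseline may be a useful point of comparison. The sketch below is textbook single-qubit linear-inversion tomography, not the method of the cited works; rho_true, the shot count, and the helper estimate_expectation are illustrative choices of mine.

```python
# Hedged illustration, not from the cited papers: single-qubit state
# tomography by linear inversion from Pauli expectation values.  The
# measurement statistics are simulated; in an experiment they would come
# from repeated measurements in the X, Y and Z bases.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def estimate_expectation(rho_true, pauli, shots, rng):
    """Simulate <P> from finite measurement shots in the eigenbasis of P."""
    evals, evecs = np.linalg.eigh(pauli)
    probs = np.real([v.conj() @ rho_true @ v for v in evecs.T])
    probs = np.clip(probs, 0, None)
    probs /= probs.sum()
    return rng.choice(evals, size=shots, p=probs).mean()

rng = np.random.default_rng(0)
# Unknown state to reconstruct (a slightly mixed state, chosen arbitrarily).
rho_true = 0.9 * np.array([[0.85, 0.25], [0.25, 0.15]], dtype=complex) + 0.1 * I2 / 2

ex = {name: estimate_expectation(rho_true, mat, shots=5000, rng=rng)
      for name, mat in (("X", X), ("Y", Y), ("Z", Z))}

# Linear inversion: rho = (I + <X> X + <Y> Y + <Z> Z) / 2.
rho_est = (I2 + ex["X"] * X + ex["Y"] * Y + ex["Z"] * Z) / 2
print(np.round(rho_est, 3))
```

Generative approaches such as the quantum Boltzmann machine aim to scale this kind of reconstruction to many qubits by fitting a parametrized model, rather than inverting a complete set of expectation values whose number grows exponentially with the number of qubits.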