2020
DOI: 10.1002/qute.202000133
Quantum Semantic Learning by Reverse Annealing of an Adiabatic Quantum Computer

Abstract: Restricted Boltzmann machines (RBMs) constitute a class of neural networks for unsupervised learning with applications ranging from pattern classification to quantum state reconstruction. Despite the potential representative power, the diffusion of RBMs is quite limited since their training process proves to be hard. The advent of commercial adiabatic quantum computers (AQCs) raised the expectation that the implementations of RBMs on such quantum devices can increase the training speed with respect to conventi…
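As background to the abstract's point that RBM training is hard, the standard classical baseline is contrastive divergence. A minimal CD-1 update can be sketched as follows; the layer sizes, learning rate, and single training pattern are illustrative choices, not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, a, b, rng, lr=0.1):
    """One contrastive-divergence (CD-1) update; returns updated parameters."""
    ph0 = sigmoid(v0 @ W + b)                       # positive phase: P(h|v0)
    h0 = (rng.random(b.size) < ph0).astype(float)   # sample hidden units
    pv1 = sigmoid(h0 @ W.T + a)                     # one Gibbs step down...
    v1 = (rng.random(a.size) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)                       # ...and back up
    # Gradient estimate: difference of data and reconstruction correlations.
    W = W + lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    a = a + lr * (v0 - v1)
    b = b + lr * (ph0 - ph1)
    return W, a, b

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 4        # toy sizes, not from the paper
W = rng.normal(0, 0.1, (n_visible, n_hidden))
a, b = np.zeros(n_visible), np.zeros(n_hidden)
v = rng.integers(0, 2, n_visible).astype(float)
for _ in range(200):
    W, a, b = cd1_step(v, W, a, b, rng)
```

The hardness the abstract refers to comes from the negative phase: CD truncates Gibbs sampling to one step, which biases the gradient. Quantum-annealing approaches aim to replace that sampling step.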

Cited by 39 publications (26 citation statements). References 33 publications.
“…Within the field of quantum machine learning (QML) [6,7], if one neglects the implementation of quantum neural networks on adiabatic quantum computers [8], there are essentially two kinds of proposals of quantum neural networks on a gate-model quantum computer. The first consists of defining a quantum neural network as a variational quantum circuit composed of parameterized gates, where non-linearity is introduced by measurement operations [9][10][11].…”
Section: Introductionmentioning
confidence: 99%
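The first proposal in this excerpt, a variational circuit of parameterized gates whose non-linearity enters through measurement, can be illustrated with a minimal single-qubit statevector sketch; the angle encoding and gate choice are assumptions for illustration, not the constructions of refs. [9–11]:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate R_y(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_output(x, theta):
    """Encode input x as a rotation, apply a trainable rotation theta,
    then measure <Z>: the measurement step supplies the non-linearity."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # |0> -> encode -> rotate
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)  # expectation value in [-1, 1]

out = circuit_output(0.3, 0.5)  # evaluates to cos(0.3 + 0.5)
```

Although each gate is linear on the state vector, the measured expectation is a non-linear function (here a cosine) of the input and the trainable parameter, which is what makes such circuits usable as neural-network layers.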
“…Instead, in contrast to both the above proposals and to quantum-annealing-based algorithms applied to neural networks [8], we develop a fully reversible algorithm.…”
Section: Introductionmentioning
confidence: 99%
“…Dmytro et al. [26] performed a benchmarking study showing that, for harder problems, Boltzmann machines trained using quantum annealing give better gradients than CD. Lorenzo et al. [27] used an RBM trained with reverse annealing to carry out semantic learning, achieving good scores on reconstruction tasks. Koshka et al. [28] showed that D-Wave quantum annealing performed better than classical simulated annealing for RBM training when the number of local valleys in the energy landscape was large.…”
Section: Introductionmentioning
confidence: 99%
“…The term epoch means a full cycle of iterations, with each training pattern participating only once. Accuracy is defined as: Accuracy = Number of correct predictions / Total number of predictions. (27)…”
mentioning
confidence: 99%
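The accuracy metric quoted above (Eq. (27) of the citing paper) amounts to a one-line computation:

```python
def accuracy(predictions, targets):
    """Fraction of correct predictions: Eq. (27) of the citing paper."""
    correct = sum(p == t for p, t in zip(predictions, targets))
    return correct / len(targets)

acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])  # 3 of 4 correct -> 0.75
```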
“…Moreover, one can also iterate the process, leading to a strategy known as iterated reverse annealing [12,26]. We note that there is some evidence that reverse annealing can outperform forward annealing in solving optimization problems [26][27][28][29][30][31][32][33]. However, our focus in this work is not on algorithmic performance, but rather on using the rich playground provided by the p = 2 p-spin model under reverse annealing to answer the question of which of a variety of models best describes the results obtained from the D-Wave annealer.…”
Section: Introductionmentioning
confidence: 99%
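The reverse-annealing protocol this excerpt describes can be sketched as a piecewise-linear control path: start from a classical state at s = 1, ramp the anneal fraction down to reintroduce quantum fluctuations, pause, then ramp back up. The list-of-(time, s) representation below mirrors the schedule format accepted by D-Wave annealers; the specific values are illustrative, not taken from the cited works:

```python
def reverse_anneal_schedule(s_min=0.5, t_ramp=1.0, t_pause=1.0):
    """Piecewise-linear reverse-annealing schedule as (time, s) pairs:
    s = 1 (classical start) -> s_min (quantum fluctuations on) -> pause
    -> s = 1 (read out). Values are illustrative placeholders."""
    return [
        (0.0, 1.0),                    # start from a prepared classical state
        (t_ramp, s_min),               # ramp down: turn on quantum fluctuations
        (t_ramp + t_pause, s_min),     # pause at s_min to explore nearby states
        (2 * t_ramp + t_pause, 1.0),   # ramp back up and read out
    ]

schedule = reverse_anneal_schedule()
```

Iterated reverse annealing, mentioned in the excerpt [12,26], would repeat this schedule with each readout serving as the initial state of the next cycle.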