2018
DOI: 10.1038/s41598-018-28212-4
Optimization of neural networks via finite-value quantum fluctuations

Abstract: We numerically test an optimization method for deep neural networks (DNNs) using quantum fluctuations inspired by quantum annealing. For efficient optimization, our method utilizes the quantum tunneling effect beyond the potential barriers. The path integral formulation of the DNN optimization generates an attracting force to simulate the quantum tunneling effect. In the standard quantum annealing method, the quantum fluctuations will vanish at the last stage of optimization. In this study, we propose a learni…
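The abstract describes a path-integral formulation in which Trotter-replica copies of the network parameters are coupled by an attracting force, with the quantum fluctuations kept finite at the end of training. The minimal sketch below illustrates one way such a scheme could look on a toy cost function; it is not the paper's implementation, and every name and schedule in it (n_replicas, gamma_final, the noise decay) is an assumption made for illustration.

```python
# Minimal sketch, NOT the paper's implementation: a quantum-annealing-inspired
# optimizer in which Trotter replicas of the parameters are coupled by an
# attracting force and the fluctuation stays finite at the end of the schedule.
# All names and schedules (n_replicas, gamma_final, temp, ...) are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Toy non-convex cost with two minima, standing in for a DNN training loss.
    return (w**2 - 1.0)**2 + 0.3 * w

def grad(w, eps=1e-5):
    # Central-difference gradient; a real DNN would use backpropagation instead.
    return (loss(w + eps) - loss(w - eps)) / (2 * eps)

def replica_coupled_sgd(n_replicas=16, n_steps=2000, lr=1e-2,
                        temp=0.5, gamma_final=5.0):
    # Each Trotter replica holds its own copy of the parameter.
    w = rng.uniform(-2.0, 2.0, size=n_replicas)
    for t in range(n_steps):
        s = t / n_steps
        gamma = gamma_final * s           # attracting force grows during annealing
        noise = temp * (1.0 - 0.9 * s)    # fluctuations shrink but remain finite
        # Neighbouring replicas on a ring, as in a path-integral picture.
        left, right = np.roll(w, 1), np.roll(w, -1)
        attract = gamma * (left + right - 2.0 * w)
        w += lr * (-grad(w) + attract) \
             + np.sqrt(2.0 * lr * noise) * rng.normal(size=n_replicas)
    return w.mean()

if __name__ == "__main__":
    w_star = replica_coupled_sgd()
    print(f"replica-averaged solution: w = {w_star:.3f}, loss = {loss(w_star):.3f}")
```

The ring coupling plays the role of the attracting force between replicas; letting the noise go to zero at the end would recover a standard annealed schedule, whereas the paper's point is to keep the fluctuation at a finite value.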

Cited by 29 publications (18 citation statements). References 33 publications.
“…Generalization performance, which measures the practical power of a machine-learning model but is not directly tied to the cost function being optimized, can be enhanced by stochastic optimization methods. In particular, QA actually leads to better generalization performance, as shown in the literature 43.…”
Section: Other Solvers and Comparison Data
Citation type: mentioning (confidence: 61%)
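The quote's point is that stochastic optimization can help generalization even though generalization is not what the training cost directly measures. As a generic (non-QA) illustration of such a stochastic method, the sketch below injects decaying Gaussian noise into plain gradient descent; the model, data, and noise schedule are made up for the example and are not taken from the cited work.

```python
# Hedged illustration, not from the cited papers: gradient noise injection as one
# example of a "stochastic method" layered on top of a deterministic optimizer.

import numpy as np

rng = np.random.default_rng(1)

# Toy over-parameterized linear model fit on a few noisy points.
X = rng.normal(size=(20, 50))
y = X[:, 0] + 0.1 * rng.normal(size=20)

def mse_grad(w):
    return 2.0 * X.T @ (X @ w - y) / len(y)

def noisy_gd(n_steps=500, lr=1e-2, noise0=0.1):
    w = np.zeros(50)
    for t in range(n_steps):
        noise = noise0 / np.sqrt(t + 1)          # decaying injected noise
        w -= lr * mse_grad(w)
        w += noise * rng.normal(size=w.shape)    # stochastic perturbation of the iterate
    return w

w = noisy_gd()
print("train MSE:", float(np.mean((X @ w - y) ** 2)))
```

The same idea appears under names such as stochastic gradient Langevin dynamics; the paper under discussion instead uses quantum fluctuations inspired by QA as the source of stochasticity.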
“…In this parameter range, quantum annealing actually gives better optimization results than simulated annealing. On the other hand, regardless of the problem to be solved, quantum annealing generally leaves stochastic fluctuations in the solutions 28,32. When the parameter α is in the intermediate range, where the difficulty inherent in the optimization problem is moderate, both simulated annealing and quantum annealing give high-quality solutions, but simulated annealing gives slightly better solutions because the relative strength of the stochastic fluctuations in quantum annealing is large.…”
Section: Results
Citation type: mentioning (confidence: 99%)
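The comparison in this quote is between simulated annealing and quantum annealing on the same problem family. The toy sketch below contrasts a standard Metropolis simulated-annealing loop with a crude path-integral ("simulated quantum annealing") variant on a one-dimensional double well; the replica count, schedules, and step sizes are arbitrary choices for illustration, not those of the cited comparison.

```python
# Illustrative sketch only: simulated annealing vs. a crude path-integral update on
# the same toy double-well cost. Schedules and acceptance rules are chosen for brevity.

import numpy as np

rng = np.random.default_rng(2)

def cost(x):
    return (x**2 - 1.0)**2 + 0.25 * x    # double well; global minimum near x = -1

def simulated_annealing(n_steps=5000, T0=1.0):
    x = rng.uniform(-2, 2)
    for t in range(n_steps):
        T = T0 * (1 - t / n_steps) + 1e-3
        x_new = x + 0.2 * rng.normal()
        dE = cost(x_new) - cost(x)
        if dE <= 0 or rng.random() < np.exp(-dE / T):    # Metropolis rule
            x = x_new
    return x

def simulated_quantum_annealing(n_steps=5000, P=16, T=0.1, J0=2.0):
    x = rng.uniform(-2, 2, size=P)                 # P Trotter replicas
    for t in range(n_steps):
        J = J0 * t / n_steps                       # replica coupling grows as the
                                                   # transverse field is reduced
        for k in range(P):
            x_new = x.copy()
            x_new[k] += 0.2 * rng.normal()
            left, right = x[(k - 1) % P], x[(k + 1) % P]
            dE = (cost(x_new[k]) - cost(x[k])) / P \
                 + J * ((x_new[k] - left)**2 + (x_new[k] - right)**2
                        - (x[k] - left)**2 - (x[k] - right)**2)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                x = x_new
        # Even at the end of the schedule the replicas do not fully collapse onto one
        # value, which is the kind of residual stochastic fluctuation the quote mentions.
    return x.mean()

print("SA solution :", simulated_annealing())
print("SQA solution:", simulated_quantum_annealing())
```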
“…Previous studies on classical Boltzmann machine learning using a quantum annealer reported better performance than standard methods on classical computers 43. From a theoretical standpoint, small quantum fluctuations improve prediction performance, as shown in several studies 44,45.…”
Section: Introduction
Citation type: mentioning (confidence: 82%)
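The quote concerns classical Boltzmann machine learning in which a quantum annealer supplies the model-side (negative-phase) samples. The sketch below shows the standard maximum-likelihood gradient for a small fully visible Boltzmann machine with the sampler kept pluggable; here the sampler is an ordinary Gibbs sampler so the example is self-contained, whereas in the cited studies the samples would come from the annealer. The function names and toy data are assumptions for illustration.

```python
# Sketch under stated assumptions: maximum-likelihood training of a small fully
# visible Boltzmann machine. The negative-phase sampler is a plain Gibbs sampler
# here; in the cited works it would be replaced by samples from a quantum annealer.

import numpy as np

rng = np.random.default_rng(3)
n = 8                                            # number of +/-1 units

def gibbs_sampler(W, b, n_samples=200, n_sweeps=20):
    s = rng.choice([-1.0, 1.0], size=(n_samples, n))
    for _ in range(n_sweeps):
        for i in range(n):
            field = s @ W[:, i] + b[i]
            p = 1.0 / (1.0 + np.exp(-2.0 * field))          # P(s_i = +1 | rest)
            s[:, i] = np.where(rng.random(n_samples) < p, 1.0, -1.0)
    return s

def train_bm(data, sampler=gibbs_sampler, lr=0.05, n_iters=100):
    W = np.zeros((n, n))
    b = np.zeros(n)
    for _ in range(n_iters):
        model = sampler(W, b)
        # Positive phase from data correlations, negative phase from model samples.
        dW = data.T @ data / len(data) - model.T @ model / len(model)
        db = data.mean(axis=0) - model.mean(axis=0)
        np.fill_diagonal(dW, 0.0)
        W += lr * dW
        b += lr * db
    return W, b

# Toy dataset: strongly correlated +/-1 patterns.
data = np.sign(rng.normal(size=(100, 1))) * np.ones((100, n))
W, b = train_bm(data)
print("learned couplings (first row):", np.round(W[0], 2))
```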