2021
DOI: 10.48550/arxiv.2111.07952
Preprint
Stochastic Gradient Line Bayesian Optimization: Reducing Measurement Shots in Optimizing Parameterized Quantum Circuits

Shiro Tamiya,
Hayata Yamasaki
Cited by 5 publications (4 citation statements)
References 64 publications (129 reference statements)
“…Note added - In independent recent work [50] another optimisation algorithm, called SGLBO (Stochastic Gradient Line Bayesian Optimization), is proposed for VQE that is at first glance similar to ours. However, this algorithm is based on the use of stochastic gradient descent to determine the gradient direction, paired with Bayesian optimisation for a line search along the gradient direction.…”
Section: Appendix D: Variational Optimiser
confidence: 84%
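As a rough illustration of the scheme described in this excerpt, the sketch below pairs a noisy gradient estimate (which fixes the search direction) with a one-dimensional Bayesian optimization of the step size along that direction. It is a minimal toy, not the authors' implementation: the cost function is a stand-in for a shot-noise-limited energy estimate, and it assumes scikit-optimize (`skopt.gp_minimize`) is available.

```python
# Minimal sketch of the SGLBO idea: a stochastic gradient fixes the search
# direction, then 1D Bayesian optimization picks the step size along it.
# The cost function below is a toy stand-in, not a real VQE energy.
import numpy as np
from skopt import gp_minimize

rng = np.random.default_rng(0)

def noisy_cost(theta, shots=100):
    """Toy stand-in for an energy estimated from a finite number of shots."""
    exact = np.sum(np.sin(theta) ** 2)
    return exact + rng.normal(scale=1.0 / np.sqrt(shots))

def stochastic_gradient(theta, shots=100):
    """Parameter-shift-style central differences evaluated with shot noise."""
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        shift = np.zeros_like(theta)
        shift[i] = np.pi / 2
        grad[i] = 0.5 * (noisy_cost(theta + shift, shots)
                         - noisy_cost(theta - shift, shots))
    return grad

theta = rng.uniform(-np.pi, np.pi, size=4)

for step in range(20):
    g = stochastic_gradient(theta)
    direction = -g / (np.linalg.norm(g) + 1e-12)   # SGD fixes the direction

    # Bayesian optimization of the step size alpha along that direction
    res = gp_minimize(
        lambda alpha: noisy_cost(theta + alpha[0] * direction),
        dimensions=[(0.0, 1.0)],   # search interval for the step size
        n_calls=15,
        random_state=step,
    )
    theta = theta + res.x[0] * direction

print("final estimated cost:", noisy_cost(theta, shots=10_000))
```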
“…In the quantum computing setting, adopting an algorithm like SHOALS translates to significantly lower total time, since certain large parts of the latency costs are paid per iteration, rather than per shot, at the expense of potentially higher (but dynamic) numbers of shots. We note that a recent paper on another classical optimizer for VQAs employs a Bayesian line search [39], which is fundamentally different from the line search considered in this paper.…”
Section: Our Contributions
confidence: 93%
“…Convergence to a local minimum can be guaranteed by the use of backtracking, where η is steadily decreased during the course of the calculation [133]. Alternatively, the step size can be selected by using Bayesian optimization to perform a line search [134].…”
Section: Optimization
confidence: 99%
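The backtracking idea mentioned in this excerpt can be illustrated with a short sketch, assuming a standard Armijo-style sufficient-decrease rule (the cited works may use a different variant): the step size eta is repeatedly shrunk until the cost decreases enough along the descent direction.

```python
# Minimal Armijo backtracking line-search sketch (assumed variant, for
# illustration only): eta is steadily decreased until sufficient decrease holds.
import numpy as np

def backtracking_step(cost, grad, theta, eta0=1.0, beta=0.5, c=1e-4, max_halvings=30):
    """Return updated parameters using an Armijo backtracking line search."""
    g = grad(theta)
    f0 = cost(theta)
    eta = eta0
    for _ in range(max_halvings):
        candidate = theta - eta * g
        # Sufficient-decrease (Armijo) condition along the steepest-descent direction
        if cost(candidate) <= f0 - c * eta * np.dot(g, g):
            return candidate
        eta *= beta          # steadily decrease the step size
    return theta             # no acceptable step found; keep current parameters

# Toy quadratic example
cost = lambda t: float(np.sum(t ** 2))
grad = lambda t: 2 * t
theta = np.array([1.0, -2.0])
for _ in range(10):
    theta = backtracking_step(cost, grad, theta)
print(theta)
```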