The gradient descent method is central to numerical optimization and is the key ingredient in many machine learning algorithms. It finds a local minimum of a function by iteratively moving along the direction of steepest descent. Since the required computational resources can become prohibitive for high-dimensional problems, it is desirable to investigate quantum versions of gradient descent, such as the protocol recently proposed by Rebentrost et al. [1]. Here, we develop this protocol and implement it on a quantum processor with limited resources. We demonstrate the iterative optimization process in a prototypical experiment on a four-qubit nuclear magnetic resonance quantum processor. Experimentally, the final point converged to the local minimum with a fidelity >94%, as quantified via full-state tomography. Moreover, our method can be applied to a multidimensional scaling problem, showing the potential to outperform its classical counterparts. Given the ongoing efforts in quantum information and data science, our work may provide a faster approach to solving high-dimensional optimization problems and a subroutine for future practical quantum computers.
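As a point of reference for the update rule that the quantum protocol accelerates, below is a minimal classical sketch of the steepest-descent iteration x_{k+1} = x_k - η∇f(x_k); the objective function, step size, and iteration count are illustrative assumptions and are not taken from the experiment.

```python
import numpy as np

def gradient_descent(grad, x0, eta=0.1, n_iters=50):
    """Steepest-descent iteration: x <- x - eta * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - eta * grad(x)
    return x

# Illustrative objective (not the experiment's): f(x) = x^T A x with A positive
# definite, whose unique minimum is at the origin.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
grad_f = lambda x: 2.0 * A @ x

x_min = gradient_descent(grad_f, x0=[1.0, -1.0])
print(x_min)  # close to [0, 0]
```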
Quantum optimization algorithms can outperform their classical counterparts and are key to modern technology. Second-order optimization (Newton's method) is a critical optimization technique that speeds up convergence by employing the second derivative of the loss function in addition to its first derivative. Here, we propose a new quantum second-order optimization algorithm for general polynomials with a computational complexity of O(poly(log d)). We apply this algorithm to solving nonlinear equations and to parameter learning in factorization machines. Numerical simulations show that our new algorithm is faster than its classical counterpart and than the first-order quantum gradient descent algorithm. While existing quantum Newton optimization algorithms apply only to homogeneous polynomials, our new algorithm handles general polynomials, which arise far more widely in real applications.
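For orientation, here is a minimal classical sketch of the Newton update x_{k+1} = x_k - H(x_k)^{-1}∇f(x_k) applied to a general (inhomogeneous) polynomial; the example polynomial, starting point, and iteration count are illustrative assumptions and do not represent the quantum algorithm itself.

```python
import numpy as np

def newton_minimize(grad, hess, x0, n_iters=20):
    """Newton iteration: x <- x - H(x)^{-1} grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

# Illustrative general (inhomogeneous) polynomial, chosen to be strictly convex:
# f(x, y) = x^4 + y^4 + x^2 + y^2 + x*y + x
def grad(v):
    x, y = v
    return np.array([4*x**3 + 2*x + y + 1.0,
                     4*y**3 + 2*y + x])

def hess(v):
    x, y = v
    return np.array([[12*x**2 + 2.0, 1.0],
                     [1.0, 12*y**2 + 2.0]])

x_star = newton_minimize(grad, hess, x0=[1.0, 1.0])
print(x_star, grad(x_star))  # gradient is ~0 at the minimizer
```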