Optimization, which aims to find the global minimum of a given cost function, is one of the central problems in science and engineering. Various numerical methods have been proposed for it, among which Gradient Descent (GD) is the most popular owing to its simplicity and efficiency. However, GD suffers from two main issues: trapping in local minima and slow convergence, especially near a minimum. Natural Gradient Descent (NGD), which has proved to be one of the most powerful methods for optimization problems in machine learning, tensor networks, variational quantum algorithms, and beyond, offers an efficient way to accelerate convergence. Here we give a unified method that extends NGD to a more general setting while retaining its fast convergence, by seeking a more suitable metric through the introduction of a 'proper' reference Riemannian manifold. Our method generalizes NGD and may offer further insight into optimization methods.
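To make the contrast between GD and NGD concrete, the following is a minimal sketch on a toy ill-conditioned quadratic cost. The cost matrix, the choice of the Hessian as the metric tensor, and the step sizes are all illustrative assumptions for this example, not the metric construction proposed in the paper; the point is only that rescaling the gradient by a metric G(θ), i.e. updating θ ← θ − η G(θ)⁻¹∇f(θ), removes the ill-conditioning that slows plain GD near the minimum.

```python
import numpy as np

# Ill-conditioned quadratic cost: plain GD must take tiny steps to stay
# stable along the steep direction, so it crawls along the flat one.
A = np.diag([1.0, 100.0])

def cost(theta):
    return 0.5 * theta @ A @ theta

def grad(theta):
    return A @ theta

def metric(theta):
    # Hypothetical metric tensor G(theta) standing in for one induced by a
    # reference Riemannian manifold; for this toy cost we use the Hessian,
    # which makes the natural-gradient step Newton-like.
    return A

theta_gd = np.array([1.0, 1.0])
theta_ngd = np.array([1.0, 1.0])

lr_gd = 0.009   # must stay below 2 / lambda_max = 0.02 for stability
lr_ngd = 0.9    # the metric flattens the geometry, so a near-unit step is safe

for _ in range(50):
    # Ordinary gradient descent.
    theta_gd = theta_gd - lr_gd * grad(theta_gd)
    # Natural gradient descent: solve G(theta) d = grad f(theta)
    # rather than forming the inverse metric explicitly.
    d = np.linalg.solve(metric(theta_ngd), grad(theta_ngd))
    theta_ngd = theta_ngd - lr_ngd * d

print("GD  cost:", cost(theta_gd))   # still far from 0 after 50 steps
print("NGD cost:", cost(theta_ngd))  # essentially 0
```

Solving the linear system at each step, rather than inverting G(θ), is the usual practical choice, since it is cheaper and numerically more stable when the metric is ill-conditioned.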