2021
DOI: 10.48550/arxiv.2112.02572
Preprint

Riemannian conjugate gradient methods: General framework and specific algorithms with convergence analyses

Abstract: This paper proposes a novel general framework of Riemannian conjugate gradient methods, that is, conjugate gradient methods on Riemannian manifolds. Conjugate gradient methods are important first-order optimization algorithms both in Euclidean spaces and on Riemannian manifolds. While various types of conjugate gradient methods have been studied in Euclidean spaces, fewer studies exist for those on Riemannian manifolds. In each iteration of the Riemannian conjugate gradient methods, the previous search …
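The idea the abstract describes — transport the previous search direction to the current point's tangent space before combining it with the new gradient — can be sketched in a minimal setting. The following Python sketch works on the unit sphere with a projection-based transport, normalisation as the retraction, a Fletcher–Reeves coefficient, and a fixed step size; all of these concrete choices are illustrative assumptions, not the paper's general framework (which covers general retractions, vector transports, and step-size rules).

```python
import numpy as np

def sphere_rcg(grad, x0, step=0.1, tol=1e-8, max_iter=200):
    """Riemannian Fletcher-Reeves CG on the unit sphere (illustrative sketch).

    Uses orthogonal projection onto the tangent space as the vector
    transport and renormalisation as the retraction; a fixed step size
    stands in for a proper line search.
    """
    def proj(x, v):
        # Project v onto the tangent space at x: remove the radial component.
        return v - (x @ v) * x

    def retract(x, v):
        # Retraction: move in the ambient space, then renormalise to the sphere.
        y = x + v
        return y / np.linalg.norm(y)

    x = np.asarray(x0, dtype=float)
    x = x / np.linalg.norm(x)
    g = proj(x, grad(x))      # Riemannian gradient
    d = -g                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = retract(x, step * d)
        g_new = proj(x_new, grad(x_new))
        d_transported = proj(x_new, d)    # transport the previous direction
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d_transported
        x, g = x_new, g_new
    return x
```

Minimising the Rayleigh quotient x^T A x over the sphere with this sketch drives the iterate toward the eigenvector of the smallest eigenvalue of A, a standard test problem for Riemannian optimisers.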


Cited by 2 publications (9 citation statements) · References 29 publications · Citing publications year: 2022
“…We also highlight that most, if not all, types of conjugate gradient methods satisfy the conditions in Theorem 6. See more discussions in [64]. As an example, consider the Fletcher–Reeves-type CG [20]. For some threshold ρ ∈ [0, 1/4), the step is only accepted when ρ_t > ρ.…”
Section: F.1 RHM with Conjugate Gradient (RHM-CG) · Citation type: mentioning
confidence: 99%
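The acceptance rule quoted above — keep the step only when the ratio ρ_t exceeds a threshold ρ ∈ [0, 1/4) — can be sketched as follows. The function name and the definition of ρ_t as actual-over-predicted decrease are assumptions in the spirit of trust-region-style acceptance tests, not taken from the cited paper.

```python
def accept_step(f_old, f_new, predicted_decrease, rho=0.1):
    """Accept a trial step when the ratio rho_t of actual to predicted
    decrease exceeds the threshold rho in [0, 1/4)."""
    if predicted_decrease <= 0:
        return False  # the model predicts no improvement: reject outright
    rho_t = (f_old - f_new) / predicted_decrease
    return rho_t > rho
```

A step that achieves most of its predicted decrease (ρ_t near 1) is accepted; one that achieves almost none (ρ_t near 0) is rejected, guarding the method against unreliable model predictions.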
“…Furthermore ∇Ĵ(X_0) = 0 now provides a necessary condition for a local minimiser [1]. What remains, following [34,2], is to appropriately adapt the existing line-search algorithms such that updating X_0^k leads to global convergence. Aside from the Pymanopt library for optimisation on matrix manifolds [35], SLSQP [36] presents the only alternative for problems with equality constraints.…”
Section: Issue 2: Constrained Gradient Update · Citation type: mentioning
confidence: 99%
“…Owing to the construction, storage O(n^2), and inversion O(n^3) of the Hessian matrix, SLSQP is however numerically too demanding to handle high-dimensional control variables [17,37]. Conversely, while Pymanopt implements computationally efficient methods, its line search does not satisfy the necessary conditions to guarantee global convergence [34].…”
Section: Issue 2: Constrained Gradient Update · Citation type: mentioning
confidence: 99%
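The O(n^2) storage / O(n^3) inversion scaling cited above is easy to make concrete with a back-of-the-envelope sketch. The numbers below assume dense float64 storage and are illustrative arithmetic, not measurements of SLSQP or Pymanopt.

```python
def hessian_bytes(n):
    # Dense n-by-n Hessian in float64: O(n^2) entries, 8 bytes each.
    return n * n * 8

def gradient_bytes(n):
    # A single gradient vector, as a first-order method keeps: O(n) entries.
    return n * 8

n = 100_000  # an assumed high-dimensional control variable
print(hessian_bytes(n) / 1e9, "GB")   # → 80.0 GB
print(gradient_bytes(n) / 1e6, "MB")  # → 0.8 MB
```

At n = 100,000 the dense Hessian alone needs 80 GB, before any O(n^3) factorisation work, while the gradient fits in under a megabyte — which is why first-order Riemannian methods remain the practical option at this scale.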