2008
DOI: 10.1515/9781400830244
Optimization Algorithms on Matrix Manifolds

Abstract (table-of-contents excerpt):
3.4 Quotient manifolds
3.4.1 Theory of quotient manifolds
3.4.2 Functions on quotient manifolds
3.4.3 The real projective space RP^{n-1}
3.4.4 The Grassmann manifold Grass(p, n)
3.5 Tangent vectors and differential maps
5.3 Riemannian connection
5.3.1 Symmetric connections
5.3.2 Definition of the Riemannian connection
5.3.3 Riemannian connection on Riemannian submanifolds
5.3.4 Riemannian connection on quotient manifolds
5.4 Geodesics, exponential mapping, and parallel translation
5.5 Riemannian He…

Cited by 2,429 publications (4,556 citation statements). References: 0 publications.
“…A brief introduction to the area can be found in [1] in this volume, and we refer to [3] and the many references therein for more details. Optimization on manifolds finds applications in two broad classes of situations: classical equality-constrained optimization problems where the constraints specify a submanifold of R n ; and problems where the objective function has continuous invariance properties that we want to eliminate for various reasons, e.g., efficiency, consistency, applicability of certain convergence results, and avoiding failure of certain algorithms due to degeneracy.…”
Section: Introduction (mentioning)
confidence: 99%
“…in [62,65,32,21,1] to cite just a few. The iteration we investigate in this paper does not fall into this family, as it does not use the covariant derivative of the vector field whose zeros we are trying to find. Moreover, we cannot recast it as an optimization problem on a Riemannian manifold, as stated above.…”
Section: Related Work (mentioning)
confidence: 99%
“…Newton algorithms on Riemannian manifolds were first proposed in the general context of optimization on manifolds [62,65]. Their convergence has been studied in depth in [47,32,1], to cite just a few of the important works. However, when the goal is to compute the Karcher mean, which is a nonlinear least-squares problem, it is more efficient to use the Gauss-Newton variant, which does not require computing the Hessian of the Riemannian distance; this avoids implementing the connection.…”
Section: A Fixed Point Iteration to Compute the Karcher Mean (mentioning)
confidence: 99%
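To make the passage above concrete, here is a minimal sketch of a fixed-point/gradient iteration for a Karcher mean, worked on the unit sphere S^{n-1} rather than on the manifolds used in the cited papers. The names (sphere_log, sphere_exp, karcher_mean) and the unit step size are illustrative assumptions, not code from [47], [32], or [1].

```python
# Hypothetical sketch: Karcher mean on the unit sphere S^{n-1} via the
# fixed-point iteration x <- Exp_x( mean_i Log_x(x_i) ), whose fixed points
# are zeros of the gradient of the sum-of-squared-distances cost.
import numpy as np

def sphere_log(x, y):
    """Riemannian logarithm Log_x(y): the tangent vector at x pointing to y."""
    c = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(c)                 # geodesic distance d(x, y)
    if theta < 1e-12:
        return np.zeros_like(x)
    v = y - c * x                        # component of y orthogonal to x
    return theta * v / np.linalg.norm(v)

def sphere_exp(x, v):
    """Riemannian exponential Exp_x(v): walk along the great circle from x."""
    t = np.linalg.norm(v)
    if t < 1e-12:
        return x
    return np.cos(t) * x + np.sin(t) * v / t

def karcher_mean(points, iters=100, tol=1e-10):
    x = points[0]
    for _ in range(iters):
        g = np.mean([sphere_log(x, p) for p in points], axis=0)
        if np.linalg.norm(g) < tol:      # (minus the) gradient has vanished
            break
        x = sphere_exp(x, g)             # unit step; no Hessian or connection
    return x

# usage: average five nearby points on S^2
rng = np.random.default_rng(0)
pts = [p / np.linalg.norm(p) for p in rng.normal(size=(5, 3)) + 3.0]
print(karcher_mean(pts))
```

As in the quoted Gauss-Newton discussion, each update needs only logarithms and exponentials, not the Hessian of the Riemannian distance.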
“…Here, we use the gradient descent with retractions from [6] on a product of Stiefel manifolds. To compute gradients, we use the Riemannian metric on the Stiefel manifold induced by the Euclidean one on R^{n×p} and equip St(n, p) × St(n, p) with the product metric.…”
Section: Example: Fitting on SO(n) (mentioning)
confidence: 99%
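The induced-metric construction described above amounts to projecting the Euclidean gradient onto the tangent space of St(n, p); on the product St(n, p) × St(n, p), this is done componentwise. The sketch below shows the single-factor case; the function names are illustrative assumptions, not code from the cited paper.

```python
# Hypothetical sketch: Riemannian gradient on the Stiefel manifold St(n, p)
# under the metric induced by the Euclidean one on R^{n x p}.
import numpy as np

def stiefel_project(X, Z):
    """Orthogonal projection of Z onto T_X St(n, p) = {V : X^T V + V^T X = 0}."""
    sym = (X.T @ Z + Z.T @ X) / 2.0
    return Z - X @ sym

def riemannian_gradient(X, euclidean_grad):
    """grad f(X) is the projection of the Euclidean gradient onto T_X."""
    return stiefel_project(X, euclidean_grad)

# example: f(X) = -0.5 * trace(X^T A X) with symmetric A has Euclidean
# gradient -A X; its Riemannian gradient must be tangent at X.
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6)); A = (A + A.T) / 2.0
X, _ = np.linalg.qr(rng.normal(size=(6, 2)))   # a point on St(6, 2)
g = riemannian_gradient(X, -A @ X)
print(np.allclose(X.T @ g + g.T @ X, 0.0))     # True: g is tangent at X
```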
“…A descent algorithm on a manifold needs suitable local mappings R_X that map lines in the tangent space onto curves in the manifold. Here, for the Stiefel manifold, we choose the polar-decomposition retraction from [6].…”
Section: Example: Fitting on SO(n) (mentioning)
confidence: 99%
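The polar-decomposition retraction the quote refers to has a closed form: for a tangent vector ξ at X, R_X(ξ) = (X + ξ)(I_p + ξ^T ξ)^{-1/2}, the orthogonal polar factor of X + ξ. Below is a minimal sketch under that assumption, not verbatim code from [6].

```python
# Hypothetical sketch: polar-decomposition retraction on St(n, p).
# For tangent xi at X, (X + xi)^T (X + xi) = I + xi^T xi, so the polar factor
# of X + xi is (X + xi)(I + xi^T xi)^{-1/2}.
import numpy as np

def polar_retraction(X, xi):
    S = np.eye(X.shape[1]) + xi.T @ xi          # SPD, so eigh is safe
    w, V = np.linalg.eigh(S)                    # eigendecomposition of S
    return (X + xi) @ (V @ np.diag(1.0 / np.sqrt(w)) @ V.T)  # times S^{-1/2}

# quick check: retracting a tangent vector lands back on the manifold
rng = np.random.default_rng(1)
X, _ = np.linalg.qr(rng.normal(size=(6, 2)))    # a point on St(6, 2)
Z = rng.normal(size=(6, 2))
xi = Z - X @ ((X.T @ Z + Z.T @ X) / 2.0)        # project Z to T_X St(6, 2)
Y = polar_retraction(X, xi)
print(np.allclose(Y.T @ Y, np.eye(2)))          # True
```

In a descent method, this retraction plays the role the quoted passage assigns to R_X: the update X ← R_X(-τ grad f(X)) maps the line t ↦ -t τ grad f(X) in the tangent space onto a curve in the manifold.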