Abstract. This is the first paper devoted to the study of weak sharp minima for constrained optimization problems on Riemannian manifolds, which are important in many applications. We consider the notions of local weak sharp minima, boundedly weak sharp minima, and global weak sharp minima for such problems and obtain their complete characterizations in the case of convex problems on finite-dimensional Riemannian manifolds and their Hadamard counterparts. A number of the results obtained in this paper are also new for the case of conventional problems in linear spaces. Our methods involve appropriate tools of variational analysis and generalized differentiation on Riemannian and Hadamard manifolds developed and efficiently implemented in this paper.

Key words. Variational analysis and optimization, weak sharp minima, Riemannian manifolds, Hadamard manifolds, convexity, generalized differentiability

AMS subject classifications. Primary 49J52; Secondary 90C31

1. Introduction. A vast majority of problems considered in optimization theory are formulated in finite-dimensional or infinite-dimensional Banach spaces, where the linear structure plays a crucial role in employing conventional tools of variational analysis and (classical or generalized) differentiation to derive optimality conditions and then develop numerical algorithms. At the same time, many optimization problems arising in various applications cannot be posed in linear spaces and require a Riemannian manifold (in particular, a Hadamard manifold) structure for their formalization and study. Among various problems of this type we mention geometric models for the human spine [2], eigenvalue optimization problems [15,45,58],