Abstract. In this paper, we provide two types of sufficient conditions for ensuring the quadratic growth conditions of a class of constrained convex symmetric and non-symmetric matrix optimization problems regularized by nonsmooth spectral functions. These sufficient conditions are derived via the study of the $C^2$-cone reducibility of spectral functions and the metric subregularity of their subdifferentials, respectively. As an application, we demonstrate how quadratic growth conditions are used to guarantee the desirable fast convergence rates of the augmented Lagrangian methods (ALM) for solving convex matrix optimization problems. Numerical experiments on an easy-to-implement ALM applied to the fastest mixing Markov chain problem are also presented to illustrate the significance of the obtained results.

Key words. matrix optimization, spectral functions, quadratic growth conditions, metric subregularity, augmented Lagrangian function, fast convergence rates

AMS subject classifications. 65K05, 90C25, 90C31

1. Introduction. The quadratic growth condition is an important concept in optimization. It is closely related to the metric subregularity and calmness of set-valued mappings (see Section 2 for definitions), and to the existence of error bounds. From different perspectives, the study of the metric subregularity and calmness of set-valued mappings plays a central role in variational analysis, for example in nonsmooth calculus and in the perturbation analysis of variational problems. We refer the reader to the monograph by Dontchev and Rockafellar [17] for a comprehensive study of both the theory and applications of related subjects; see also [31, 18, 24, 37, 25, 26, 43] and the references therein for recent advances. Instead of considering general set-valued mappings, in this paper we mainly focus on the solution mappings of convex matrix optimization problems.
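To fix ideas before the formal development (precise definitions are deferred to Section 2, and the notation $f$, $\mathcal{S}$, $c$, $\kappa$ used here is illustrative rather than that of the paper), recall the standard forms of the two properties just mentioned. For a proper closed convex function $f$ with nonempty optimal solution set $\mathcal{S} := \arg\min f$, the quadratic growth condition holds at $\bar{x} \in \mathcal{S}$ if there exist $c > 0$ and a neighborhood $\mathcal{N}$ of $\bar{x}$ such that
\[
f(x) \;\ge\; \inf f \, + \, c\,\mathrm{dist}^2(x, \mathcal{S}) \qquad \text{for all } x \in \mathcal{N},
\]
while the metric subregularity of the subdifferential mapping $\partial f$ at $\bar{x}$ for $0 \in \partial f(\bar{x})$ asserts the existence of $\kappa > 0$ such that
\[
\mathrm{dist}\bigl(x, (\partial f)^{-1}(0)\bigr) \;\le\; \kappa\,\mathrm{dist}\bigl(0, \partial f(x)\bigr) \qquad \text{for all } x \in \mathcal{N}.
\]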
It is known from [1] that for convex problems, the metric subregularity of the subdifferentials of the essential objective functions (or the calmness of the solution mappings) can be equivalently characterized by the corresponding quadratic growth conditions. This connection motivates us to study sufficient conditions for ensuring the latter properties. Beyond their intrinsic interest in second-order variational analysis, quadratic growth conditions can be employed to derive the convergence rates of various first-order and second-order algorithms, including the proximal gradient methods [39, 58], the proximal point algorithms [50, 40, 36, 37], and the generalized Newton-type methods [22, 19, 42]. The convex matrix optimization problems considered in our paper take the fol-