Abstract. Positive definite matrices abound in a dazzling variety of applications. This ubiquity can in part be attributed to their rich geometric structure: positive definite matrices form a self-dual convex cone whose strict interior is a Riemannian manifold. The manifold view is endowed with a "natural" distance function while the conic view is not. Nevertheless, drawing motivation from the conic view, we introduce the S-Divergence as a "natural" distance-like function on the open cone of positive definite matrices. We motivate the S-Divergence via a sequence of results that connect it to the Riemannian distance. In particular, we show (a) that this divergence is the square of a distance; and (b) that it has several geometric properties similar to those of the Riemannian distance, without being computationally as demanding. The S-Divergence is even more intriguing: although nonconvex, matrix means and medians under it can still be computed to global optimality. We complement our results with numerical experiments illustrating our theorems and our optimization algorithm for computing matrix medians.

Key words. Bregman matrix divergence; log-determinant; Stein divergence; Jensen-Bregman divergence; matrix geometric mean; matrix median; nonpositive curvature

1. Introduction. Hermitian positive definite (HPD) matrices are a noncommutative generalization of the positive reals. They abound in a multitude of applications and exhibit attractive geometric properties, e.g., they form a differentiable Riemannian (also Finslerian) manifold [10, 33] that is a well-studied example of a manifold of nonpositive curvature [17, Ch. 10].
HPD matrices possess even more structure: (i) they embody a canonical higher-rank symmetric space [51]; and (ii) their closure forms a closed, self-dual convex cone. The convex conic view enjoys great importance in convex optimization [6, 43, 44] and in nonlinear Perron-Frobenius theory [40]; symmetric spaces are important in algebra, analysis [32, 39, 51], and optimization [43, 52]; while the manifold view (Riemannian or Finslerian) plays diverse roles; see [10, Ch. 6] and [46]. The manifold view is equipped with a "natural" distance function while the conic view is not. Nevertheless, drawing motivation from the convex conic view, we introduce the S-Divergence as a "natural" distance-like function on the open cone of positive definite matrices. Indeed, we prove a sequence of results connecting the S-Divergence to the Riemannian distance. Most importantly, we show (a) that this divergence is the square of a distance; and (b) that it has several geometric properties in common with the Riemannian distance, without being numerically as demanding. This builds an informal link between the manifold and conic views of HPD matrices.

1.1. Background and notation. We begin by fixing notation. The letter H denotes some Hilbert space, usually just C^n. The inner product between two vectors x and y in H is ⟨x, y⟩ := x^*y (x^* denotes 'conjugate transpose'). The set of n × n Hermitian matrices is denoted as H...
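To make the object of study concrete before the formal development, the following is a minimal numerical sketch of the S-Divergence in its standard Jensen-Bregman LogDet (Stein divergence) form, S(X, Y) = log det((X+Y)/2) − (1/2) log det(XY); the function name `s_divergence` and the use of NumPy are illustrative choices, not part of the paper's formal apparatus.

```python
import numpy as np

def s_divergence(X, Y):
    """S-Divergence (Stein / Jensen-Bregman LogDet divergence) between
    two Hermitian positive definite matrices X and Y:
        S(X, Y) = log det((X + Y)/2) - (1/2) log det(XY).
    slogdet is used for numerical stability instead of det + log."""
    _, logdet_mid = np.linalg.slogdet((X + Y) / 2)
    _, logdet_x = np.linalg.slogdet(X)
    _, logdet_y = np.linalg.slogdet(Y)
    return logdet_mid - 0.5 * (logdet_x + logdet_y)

# Basic sanity checks on small HPD matrices:
X = np.array([[2.0, 1.0], [1.0, 3.0]])
Y = 4.0 * np.eye(2)
print(s_divergence(X, X))  # zero: the divergence of a matrix from itself
print(s_divergence(X, Y), s_divergence(Y, X))  # symmetric in its arguments
```

Note that, unlike the squared Riemannian distance, this expression requires only determinants (e.g., via Cholesky factorizations), not matrix logarithms or eigendecompositions, which is the computational advantage alluded to above.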