Abstract. In this article we consider means of positive bounded linear operators on a Hilbert space. We present a complete theory providing a framework that extends the theory of the Karcher mean, its approximating matrix power means, and a large part of Kubo-Ando theory to arbitrarily many variables, in fact to probability measures with bounded support on the cone of positive definite operators. This framework characterizes each operator mean extrinsically as the unique solution of a generalized Karcher equation, obtained by replacing the matrix logarithm in the Karcher equation with an arbitrary operator monotone function on the positive real half-line. If the underlying Hilbert space is finite dimensional, then these generalized Karcher equations are Riemannian gradients of convex combinations of strictly geodesically convex log-determinant divergence functions, so these new means are their global minimizers, in analogy with the Karcher mean. Our framework is based on fundamental contraction results with respect to the Thompson metric, which provide nonlinear contraction semigroups in the cone of positive definite operators; these semigroups form a decreasing net approximating the operator means from above in the strong topology.
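For orientation, the following is a sketch of the finitely supported special case in one common normalization; the abstract above treats general probability measures with bounded support. For positive definite operators $A_1,\dots,A_n$ and weights $w_i > 0$ with $\sum_i w_i = 1$, the Karcher mean is the unique positive definite solution $X$ of the Karcher equation, and the generalized equations referred to above replace $\log$ by an operator monotone function $f$ on $(0,\infty)$ (normalized, e.g., so that $f(1)=0$):

```latex
% Karcher equation (weighted n-point case):
\sum_{i=1}^{n} w_i \,\log\!\bigl(X^{-1/2} A_i X^{-1/2}\bigr) = 0.

% Generalized Karcher equation, with f operator monotone on (0,\infty), f(1)=0:
\sum_{i=1}^{n} w_i \, f\!\bigl(X^{-1/2} A_i X^{-1/2}\bigr) = 0.
```

Taking $f = \log$ recovers the Karcher mean itself; the exact normalization conditions on $f$ are those specified in the paper.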
We show that the Karcher mean of n points in any Hadamard space can be approximated by a natural, explicitly constructed sequence. In the special case where the Hadamard space is the Riemannian manifold of positive definite matrices, this was recently proved by J. Holbrook. We also establish a general version in which the n points are assigned different weights.
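As a hedged illustration of the flavor of such an explicit sequence (the paper's precise construction may differ): given points $a_1,\dots,a_n$ in a Hadamard space, one cycles through them along geodesics with harmonically decreasing step sizes,

```latex
% Inductive sequence cycling through a_1, ..., a_n:
s_1 = a_1, \qquad
s_{k+1} = s_k \,\#_{\frac{1}{k+1}}\, a_{(k \bmod n)+1},
```

where $x \,\#_t\, y$ denotes the point at parameter $t$ on the unique geodesic from $x$ to $y$. The claim is then that $s_k$ converges to the Karcher mean of $a_1,\dots,a_n$; the weighted version adjusts how often, or with what step sizes, each point is visited.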
Abstract. We develop the theory of discrete-time gradient flows for convex functions on Alexandrov spaces with arbitrary upper or lower curvature bounds. We employ different resolvent maps in the upper and lower curvature bound cases to construct such a flow, and show its convergence to a minimizer of the potential function. We also prove a stochastic version, a generalized law of large numbers for convex-function-valued random variables, which not only extends Sturm's law of large numbers on nonpositively curved spaces to arbitrary lower or upper curvature bounds, but also seems new even in the Euclidean setting. These results generalize those in nonpositively curved spaces (partly for squared distance functions) due to Bačák, Jost, Sturm, and others; the lower curvature bound case seems entirely new.
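For context, the standard resolvent map that such discrete-time schemes are modeled on (on a nonpositively curved space; the abstract above uses modified resolvents adapted to the curvature bound) is the Moreau-Yosida proximal map, iterated to produce the discrete gradient flow:

```latex
% Resolvent (proximal map) of a convex function f with step size \tau > 0:
J_{\tau}^{f}(x) := \operatorname*{arg\,min}_{y}
  \Bigl[\, f(y) + \frac{1}{2\tau}\, d(x,y)^2 \,\Bigr],

% Discrete-time gradient flow:
x_{k+1} = J_{\tau}^{f}(x_k), \qquad k = 0, 1, 2, \dots
```

On a CAT(0) space the minimizer exists and is unique by convexity of $y \mapsto d(x,y)^2$; under other curvature bounds this map must be modified, which is the point of the differing resolvent constructions mentioned above.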
We generalize the theory of gradient flows of semi-convex functions on CAT(0)-spaces, developed by Mayer and Ambrosio-Gigli-Savaré, to CAT(1)-spaces. The key tool is the so-called "commutativity," which represents a Riemannian nature of the space; all results also hold for metric spaces satisfying the commutativity with semi-convex squared distance functions. Our approach, combining the semi-convexity of the squared distance function with a Riemannian property of the space, seems to be of independent interest, and can be compared with Savaré's work on the local angle condition under lower curvature bounds. Applications include the convergence of the discrete variational scheme to a unique gradient curve, the contraction property and the evolution variational inequality of the gradient flow, and a Trotter-Kato product formula for pairs of semi-convex functions.
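As a hedged sketch of the shape a Trotter-Kato product formula takes in this setting (the paper's exact hypotheses and notation may differ): writing $J_{\tau}^{f}$ for the resolvent of $f$ with step $\tau$, the gradient flow of the sum $f+g$ is recovered by alternating small resolvent steps of $f$ and $g$,

```latex
% Trotter-Kato product formula for the gradient flow of f + g:
\mathcal{G}_{f+g}(t)\,x \;=\; \lim_{k \to \infty}
  \bigl( J_{t/k}^{g} \circ J_{t/k}^{f} \bigr)^{k} x,
```

where $\mathcal{G}_{f+g}(t)$ denotes the gradient-flow semigroup of $f+g$ at time $t$. The semi-convexity of both functions and the commutativity condition are what make the limit well defined and independent of the splitting.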