Representing images and videos with Symmetric Positive Definite (SPD) matrices and exploiting the intrinsic Riemannian geometry of the resulting manifold has proved successful in many computer vision tasks. Since SPD matrices lie in a nonlinear space known as a Riemannian manifold, researchers have recently shown growing interest in learning discriminative SPD matrices with appropriate Riemannian metrics. However, the computational cost of analyzing high-dimensional SPD matrices is nonnegligible in practical applications. Inspired by the theory of nonparametric estimation, we propose a probability distribution-based approach to overcome this drawback by learning a mapping from the manifold of high-dimensional SPD matrices to a lower-dimensional manifold, which can be expressed as an optimization problem on the Grassmann manifold. Specifically, we perform dimensionality reduction for high-dimensional SPD matrices using popular Riemannian metrics and an affinity matrix constructed from an estimated probability distribution function (PDF) to achieve maximum class separability. Evaluations on several classification tasks show the competitiveness of the proposed approach compared with state-of-the-art methods.
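The mapping described above can be illustrated with a minimal sketch. A standard way to reduce the dimension of an SPD matrix while preserving positive definiteness is a bilinear projection X ↦ WᵀXW, where W is a tall matrix with orthonormal columns; the set of such subspaces is exactly the Grassmann manifold over which the paper optimizes. The function below is an illustrative assumption about this projection step, not the paper's full learning algorithm (which additionally optimizes W against a PDF-based affinity matrix):

```python
import numpy as np

def project_spd(X, W):
    """Bilinear projection of an SPD matrix to a lower dimension.

    X : (D, D) SPD matrix.
    W : (D, d) matrix with orthonormal columns, d < D
        (a representative of a point on the Grassmann manifold Gr(d, D)).
    Returns the (d, d) matrix W^T X W, which is again SPD because
    W has full column rank.
    """
    return W.T @ X @ W

# Hypothetical usage: project a 10x10 SPD matrix down to 3x3.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
X = A @ A.T + 1e-3 * np.eye(10)          # random SPD matrix
W, _ = np.linalg.qr(rng.standard_normal((10, 3)))  # orthonormal columns
Y = project_spd(X, W)                     # 3x3, still SPD
```

In the actual method, W is not random: it is learned by maximizing class separability under a Riemannian metric, with the orthonormality constraint handled by optimizing over the Grassmann manifold.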
The class of symmetric positive definite (SPD) matrices, especially in the form of covariance descriptors (CovDs), has been receiving increased interest in many computer vision tasks. Covariance descriptors offer a compact way of robustly fusing different types of features while absorbing measurement variations. Successful examples of applying CovDs to classification problems include object recognition, face recognition, human tracking, texture categorization, and visual surveillance.

As a data descriptor, CovDs encode the second-order statistics of features extracted from a finite number of observation points (e.g., the pixels of an image) and capture the relative correlations of these features as a means of representation. In general, CovDs are SPD matrices, and it is well known that the space of SPD matrices (denoted by Sym+) is not a subspace of Euclidean space but a Riemannian manifold with nonpositive curvature. As a consequence, conventional learning methods based on Euclidean geometry are not the optimal choice for CovDs, as proven in several prior studies.

In order to better cope with the Riemannian structure of CovDs, many methods based on non-Euclidean metrics (e.g., affine-invariant metrics, log-Euclidean metrics, Bregman divergences, and Stein metrics) have been proposed over the last few years. In particular, the log-Euclidean metric possesses several desirable properties which are beneficial for classification: (i) it is fast to compute; (ii) it defines a true geodesic on Sym+; and (iii) it comes up with
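The two ingredients above can be sketched concretely. A CovD is simply the covariance matrix of per-point feature vectors (regularized so it is strictly positive definite), and the log-Euclidean distance between two SPD matrices is the Frobenius distance between their matrix logarithms, which for SPD matrices can be computed cheaply via an eigendecomposition; this is one reason the metric is fast to compute. The function names below are illustrative, not from the paper:

```python
import numpy as np

def covariance_descriptor(features, eps=1e-6):
    """CovD of a set of feature vectors.

    features : (n_points, d) array, one d-dimensional feature vector
               per observation point (e.g., per pixel).
    A small multiple of the identity is added so the result is
    strictly SPD even when features are rank-deficient.
    """
    C = np.cov(features, rowvar=False)
    return C + eps * np.eye(C.shape[0])

def spd_logm(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, U = np.linalg.eigh(S)
    return (U * np.log(w)) @ U.T

def log_euclidean_distance(A, B):
    """d(A, B) = || log(A) - log(B) ||_F for SPD matrices A, B."""
    return np.linalg.norm(spd_logm(A) - spd_logm(B), ord="fro")
```

A practical consequence of this metric: after mapping every CovD through `spd_logm`, the matrices live in a flat (Euclidean) space, so standard vector-space classifiers can be applied to the log-mapped descriptors.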