A successful computational trick for solving positive semidefinite (PSD) programs efficiently is to consider a relaxation in which PSD-ness is enforced only on a collection of submatrices. To study this formally, we consider the class of $n \times n$ symmetric matrices for which PSD-ness is enforced on all $k \times k$ principal submatrices; we call a matrix in this class $k$-locally PSD. To compare the set of $k$-locally PSD matrices (denoted $S^{n,k}$) to the set of PSD matrices, we study the eigenvalues of $k$-locally PSD matrices. The key insight in this paper is that the eigenvalues of a matrix in $S^{n,k}$ are contained in a convex set $H(e_k^n)$, which can be defined as the hyperbolicity cone of the elementary symmetric polynomial $e_k^n$ (where $e_k^n(x) = \sum_{S \subseteq [n]:\, |S| = k} \prod_{i \in S} x_i$) with respect to the all-ones vector. Using this insight, we improve previously known upper bounds on the Frobenius distance between matrices in $S^{n,k}$ and PSD matrices. We also study the quality of the convex relaxation $H(e_k^n)$. We first show that this relaxation is tight for the case $k = n-1$; that is, for every vector in $H(e_{n-1}^n)$ there exists a matrix in $S^{n,n-1}$ whose eigenvalues are equal to the components of the vector. We then prove a structure theorem that precisely characterizes the non-singular matrices in $S^{n,k}$ whose eigenvalues belong to the boundary of $H(e_k^n)$. This result indicates that "large parts" of the boundary of $H(e_k^n)$ do not intersect the set of eigenvalues of matrices in $S^{n,k}$ when $k < n-1$.
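For concreteness, one standard way to spell out the parenthetical definition above (a sketch of the usual hyperbolic-polynomial formulation, which may differ in minor conventions from the precise definition used in the body of the paper) is the following: $e_k^n$ is hyperbolic with respect to the all-ones vector $\mathbf{1}$, meaning that $t \mapsto e_k^n(x - t\mathbf{1})$ has only real roots for every $x \in \mathbb{R}^n$, and the (closed) hyperbolicity cone is
\[
H(e_k^n) \;=\; \bigl\{\, x \in \mathbb{R}^n : \text{every root of } t \mapsto e_k^n(x - t\mathbf{1}) \text{ is nonnegative} \,\bigr\}.
\]
For example, when $k = n$ we have $e_n^n(x - t\mathbf{1}) = \prod_{i=1}^n (x_i - t)$, whose roots are $x_1, \dots, x_n$, so $H(e_n^n)$ is the nonnegative orthant; this mirrors the fact that a symmetric matrix is PSD exactly when all of its eigenvalues are nonnegative, which is the analogy the eigenvalue containment above exploits.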