Let $X_1, \dots, X_n$ be an i.i.d. sample in $\mathbb{R}^p$ with zero mean and covariance matrix $\Sigma^*$. The classical PCA approach recovers the projector $P^*_J$ onto the principal eigenspace of $\Sigma^*$ by its empirical counterpart $\hat{P}_J$. The recent paper [24] investigated the asymptotic distribution of the Frobenius distance $\| \hat{P}_J - P^*_J \|_2$ between the projectors, while [27] offered a bootstrap procedure to measure the uncertainty in recovering the subspace $P^*_J$ even in a finite-sample setup. The present paper considers this problem from a Bayesian perspective and suggests using the credible sets of the pseudo-posterior distribution on the space of covariance matrices, induced by the conjugate Inverse Wishart prior, as sharp confidence sets. This yields a numerically efficient procedure. Moreover, we theoretically justify this method and derive finite-sample bounds on the corresponding coverage probability. Contrary to [24, 27], the obtained results are valid for non-Gaussian data: the main assumption that we impose is the concentration of the sample covariance $\hat{\Sigma}$ in a vicinity of $\Sigma^*$. Numerical simulations illustrate the good performance of the proposed procedure even on non-Gaussian data in a rather challenging regime.

MSC 2010 subject classifications: Primary 62F15, 62H25, 62G20; secondary 62F25.

Usually one estimates the true unknown covariance by the sample covariance matrix, given by
$$\hat{\Sigma} = \frac{1}{n} \sum_{i=1}^{n} X_i X_i^\top.$$
Quantifying the quality of approximation of $\Sigma^*$ by $\hat{\Sigma}$ is one of the most classical problems in statistics. Surprisingly, a number of deep and strong results in this area appeared quite recently. The progress is mainly due to Bernstein-type results on the spectral norm $\| \hat{\Sigma} - \Sigma^* \|_\infty$ in random matrix theory; see, for instance, [22, 29, 31, 33, 1]. It appears that the quality of approximation is of order $n^{-1/2}$, while the dimensionality $p$ enters only logarithmically in the error bound. This allows one to apply the results even in cases of very high data dimension.

Functionals of the covariance matrix also arise frequently in applications. For instance, eigenvalues are well studied in different regimes; see [26, 9, 19, 34] and many more references therein. The Frobenius norm and other $l_r$-norms of the covariance matrix are of great interest in financial applications; see, e.g., [10].

Much less is known about the quality of estimation of a spectral projector, which is a nonlinear functional of the covariance matrix. However, such objects arise in dimension reduction methods, manifold learning and spectral methods in community detection; see [11] and references therein for an overview of problems where spectral projectors play a crucial role. Special attention should be paid to Principal Component Analysis (PCA), probably the most famous dimension reduction method. Nowadays, PCA-based methods are actively used in deep network architectures [17] and finance [12], along with other applications. Over the past decade huge progress was achieved in theoretical guarantees for sparse PCA in high dimensions...
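To make the objects above concrete, here is a minimal numerical sketch of the empirical projector $\hat{P}_J$ and the Frobenius distance $\| \hat{P}_J - P^*_J \|_2$. The spiked-covariance model, sample size and all parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def top_j_projector(S, J):
    """Orthogonal projector onto the span of the J leading eigenvectors of S."""
    # np.linalg.eigh returns eigenvalues in ascending order,
    # so the J leading eigenvectors are the last J columns.
    _, vecs = np.linalg.eigh(S)
    U = vecs[:, -J:]
    return U @ U.T

# Illustrative spiked-covariance setup (all values are hypothetical):
rng = np.random.default_rng(0)
p, n, J = 20, 500, 3
Sigma_star = np.diag(np.concatenate([np.full(J, 5.0), np.ones(p - J)]))
X = rng.multivariate_normal(np.zeros(p), Sigma_star, size=n)

Sigma_hat = X.T @ X / n                  # sample covariance (zero-mean data)
P_star = top_j_projector(Sigma_star, J)  # true projector P*_J
P_hat = top_j_projector(Sigma_hat, J)    # empirical counterpart
print(np.linalg.norm(P_hat - P_star, "fro"))  # Frobenius distance
```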
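Continuing the sketch above (and reusing `top_j_projector`, `Sigma_hat`, `n` and `J` from it), the following outlines how credible sets of the kind described in the abstract could be computed: draw covariance matrices from the Inverse Wishart pseudo-posterior obtained from a conjugate prior under a Gaussian working likelihood, and take a quantile of the posterior projector distances as the credible radius. The prior parameters and the quantile construction are assumptions made for illustration; the paper's exact procedure and its theoretical calibration are not reproduced here.

```python
import numpy as np
from scipy.stats import invwishart

def credible_radius(Sigma_hat, n, J, alpha=0.05, n_draws=2000):
    """(1 - alpha)-quantile of posterior Frobenius projector distances."""
    p = Sigma_hat.shape[0]
    # Assumed conjugate prior IW(G, d) with G = I_p, d = p + 2; under a
    # Gaussian working likelihood the posterior is IW(G + n*Sigma_hat, d + n).
    G, d = np.eye(p), p + 2
    posterior = invwishart(df=d + n, scale=G + n * Sigma_hat)
    P_hat = top_j_projector(Sigma_hat, J)
    draws = posterior.rvs(size=n_draws)  # shape (n_draws, p, p)
    dists = [np.linalg.norm(top_j_projector(S, J) - P_hat, "fro")
             for S in draws]
    return np.quantile(dists, 1 - alpha)

# The credible set {Sigma : ||P_J(Sigma) - P_hat||_2 <= r} is then
# reported as a confidence set for the principal eigenspace.
r = credible_radius(Sigma_hat, n, J)
print(r)
```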