The main contribution of this paper is the derivation of non-asymptotic convergence rates for Nyström kernel CCA in the statistical learning setting. Our theoretical results reveal that, under certain conditions, Nyström kernel CCA achieves a convergence rate comparable to that of standard kernel CCA while offering significant computational savings. This finding has important implications for the practical application of kernel CCA, particularly in scenarios where computational efficiency is crucial. Numerical experiments are provided to demonstrate the effectiveness of Nyström kernel CCA.
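To make the object of study concrete, the following is a minimal NumPy sketch of Nyström-approximated kernel CCA, not the paper's exact estimator: it assumes a Gaussian kernel, uniformly sampled landmarks, and Tikhonov regularization, and all names and parameter values (`m`, `gamma`, `eps`) are illustrative choices rather than anything specified in the text.

```python
import numpy as np

def nystrom_features(X, landmarks, gamma, jitter=1e-8):
    """Map X to m-dimensional Nystrom features K_nm @ K_mm^{-1/2}."""
    def gauss(A, B):  # Gaussian kernel block exp(-gamma * ||a - b||^2)
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K_nm = gauss(X, landmarks)
    K_mm = gauss(landmarks, landmarks)
    # Symmetric inverse square root of the (jittered) landmark Gram matrix
    w, V = np.linalg.eigh(K_mm + jitter * np.eye(len(landmarks)))
    return K_nm @ V @ np.diag(w ** -0.5) @ V.T

def nystrom_kcca(X, Y, m=50, gamma=1.0, eps=1e-3, seed=0):
    """Canonical correlations via regularized linear CCA on Nystrom features."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform landmark sampling
    Phi_x = nystrom_features(X, X[idx], gamma)
    Phi_y = nystrom_features(Y, Y[idx], gamma)
    Phi_x = Phi_x - Phi_x.mean(0)  # center the feature maps
    Phi_y = Phi_y - Phi_y.mean(0)
    # Regularized covariance blocks in the m-dimensional feature space
    Cxx = Phi_x.T @ Phi_x / n + eps * np.eye(m)
    Cyy = Phi_y.T @ Phi_y / n + eps * np.eye(m)
    Cxy = Phi_x.T @ Phi_y / n
    # Whiten both blocks; singular values of the whitened cross-covariance
    # are the (regularized) canonical correlations.
    wx, Vx = np.linalg.eigh(Cxx)
    wy, Vy = np.linalg.eigh(Cyy)
    Wx = Vx @ np.diag(wx ** -0.5) @ Vx.T
    Wy = Vy @ np.diag(wy ** -0.5) @ Vy.T
    return np.linalg.svd(Wx @ Cxy @ Wy, compute_uv=False)

# Toy data sharing one latent signal: the top canonical correlation is high.
rng = np.random.default_rng(1)
z = rng.standard_normal((500, 1))
X = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 2))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 2))])
corrs = nystrom_kcca(X, Y, m=50, gamma=0.5)
print(round(float(corrs[0]), 2))
```

The computational saving comes from working with m-dimensional Nyström features instead of the full n-by-n kernel matrices, reducing the eigenproblem from size n to size m.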