Truncated singular value decomposition (SVD), also known as the best low-rank matrix approximation, has been successfully applied in many domains, such as biology and healthcare, where high-dimensional datasets are prevalent. To enhance the interpretability of the truncated SVD, sparse SVD (SSVD) has been introduced to select a few rows and columns of the original matrix along with the low-rank approximation. Unlike the existing literature, this paper presents a novel SSVD formulation that exactly selects the best submatrix of up to a given size to maximize its truncated Ky Fan norm. Since the SSVD problem is NP-hard, we are motivated to study effective algorithms with provable performance guarantees. To do so, we first reformulate SSVD as a mixed-integer semidefinite program, which can be solved exactly for small- or medium-sized instances by a customized branch-and-cut algorithm with closed-form cuts and is particularly useful for evaluating the quality of approximation algorithms. We then develop three selection algorithms based on different selection criteria and two search algorithms: greedy and local search. We prove approximation ratios for all of the proposed approximation algorithms and show that these ratios are tight, i.e., they cannot be improved. Finally, our numerical study demonstrates the high solution quality and computational efficiency of the proposed algorithms.
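For concreteness, the selection problem described above can be sketched as follows; the notation ($A$, $n$, $m$, $k_1$, $k_2$, $r$) is ours and may differ from the paper's. Given a data matrix $A \in \mathbb{R}^{n \times m}$, row and column budgets $k_1$ and $k_2$, and a truncation level $r$, one plausible reading of the SSVD formulation is
\[
\max_{\substack{S \subseteq [n],\ |S| \le k_1 \\ T \subseteq [m],\ |T| \le k_2}} \;\sum_{i=1}^{r} \sigma_i\bigl(A_{S,T}\bigr),
\]
where $A_{S,T}$ denotes the submatrix of $A$ with rows indexed by $S$ and columns indexed by $T$, and $\sigma_1(\cdot) \ge \sigma_2(\cdot) \ge \cdots$ are its singular values, so that the objective is the truncated Ky Fan norm (the sum of the $r$ largest singular values) of the selected submatrix.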