While matrix completion algorithms fill in missing entries of a matrix given a subset of observed samples, how to best pre-select matrix entries to collect informative observations under a sampling budget remains largely unaddressed. In this paper, we propose a fast graph sampling algorithm to select entries for matrix completion. Specifically, we first regularize the matrix reconstruction objective using a dual graph signal smoothness prior, resulting in a system of linear equations to solve. We then seek to maximize the smallest eigenvalue λ_min of the coefficient matrix by choosing appropriate samples, thereby maximizing the stability of the linear system. To efficiently optimize this combinatorial problem, we derive a greedy sampling strategy, leveraging the Gershgorin circle theorem, that iteratively selects one sample at a time corresponding to the largest-magnitude entry in the first eigenvector of a shifted graph Laplacian matrix. Our algorithm benefits computationally from warm starts, since the first eigenvectors of progressively shifted Laplacian matrices are computed repeatedly as more samples are selected. Toward computation-scalable sampling on large matrices, we further rewrite the coefficient matrix as a sum of two separate components, each of which exhibits an attractive block-diagonal structure that we exploit for alternating block-wise sampling. Extensive experiments on both synthetic and real-world datasets show that our graph sampling algorithms substantially outperform existing sampling schemes for matrix completion when combined with a range of modern matrix completion algorithms.
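To make the greedy selection idea concrete, the following is a minimal illustrative sketch, not the paper's exact algorithm: it assumes a single graph Laplacian L, a coefficient matrix of the hypothetical form B = diag(a) + mu*L (a being the 0/1 sampling indicator), and at each step picks the unsampled node with the largest-magnitude entry in the eigenvector associated with the smallest eigenvalue of B. The function name greedy_sample, the parameter mu, and the budget are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def greedy_sample(L, budget, mu=0.01):
    """Greedy sketch: grow the sample set one node at a time, each time
    choosing the node with the largest-magnitude entry in the first
    eigenvector (smallest eigenvalue) of the current coefficient matrix."""
    n = L.shape[0]
    a = np.zeros(n)                       # sampling indicator vector
    selected = []
    for _ in range(budget):
        B = np.diag(a) + mu * L           # assumed coefficient matrix of the linear system
        w, V = np.linalg.eigh(B)          # eigenvalues in ascending order
        v1 = V[:, 0]                      # first eigenvector (smallest eigenvalue)
        # pick the unsampled node with the largest-magnitude entry in v1
        order = np.argsort(-np.abs(v1))
        pick = next(i for i in order if a[i] == 0)
        a[pick] = 1.0
        selected.append(int(pick))
    return selected

# Example usage on a small path-graph Laplacian
n = 6
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A
print(greedy_sample(L, budget=3))
```

Note that this sketch recomputes a full eigen-decomposition at every step for clarity; the warm-start and block-diagonal strategies described above are precisely what the paper uses to avoid that cost on large matrices.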