Amid historically low response rates, survey researchers seek ways to reduce respondent burden while measuring desired concepts with precision. We propose to ask fewer questions of respondents and impute missing responses via probabilistic matrix factorization. A variance-minimizing active learning criterion chooses the most informative questions per respondent. In simulations of our matrix sampling procedure on real-world surveys, as well as in a Facebook survey experiment, we find that active question selection achieves efficiency gains over baselines. The reduction in imputation error is heterogeneous across questions and depends on the latent concepts they capture. The imputation procedure can benefit from incorporating respondent side information, modeling responses as ordered logit rather than Gaussian, and accounting for order effects. With our method, survey researchers obtain principled suggestions of questions to retain and, if desired, can automate the design of shorter instruments.
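As a rough illustration only (not the paper's implementation), the two core ideas above can be sketched in a few dozen lines: impute missing responses with a simple MAP estimate of a Gaussian probabilistic matrix factorization, then select each respondent's next question by maximizing predictive variance under that respondent's factor posterior. All names, dimensions, and hyperparameters below are hypothetical choices for the sketch; the actual method also handles ordered-logit responses, side information, and order effects, none of which is modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a complete response matrix from a low-rank model (assumption:
# Gaussian responses; the full method also supports ordered-logit outcomes).
n_resp, n_q, k = 50, 12, 3
U_true = rng.normal(size=(n_resp, k))
V_true = rng.normal(size=(n_q, k))
R_full = U_true @ V_true.T + 0.1 * rng.normal(size=(n_resp, n_q))

# Observe a random subset of entries (a simple matrix sampling design).
mask = rng.random((n_resp, n_q)) < 0.5

def fit_pmf(R, mask, k=3, lam=0.1, iters=50):
    """Alternating ridge regressions on observed entries: a MAP estimate
    for probabilistic matrix factorization with Gaussian priors."""
    n, m = R.shape
    U = rng.normal(scale=0.1, size=(n, k))
    V = rng.normal(scale=0.1, size=(m, k))
    for _ in range(iters):
        for i in range(n):
            obs = mask[i]
            A = V[obs].T @ V[obs] + lam * np.eye(k)
            U[i] = np.linalg.solve(A, V[obs].T @ R[i, obs])
        for j in range(m):
            obs = mask[:, j]
            A = U[obs].T @ U[obs] + lam * np.eye(k)
            V[j] = np.linalg.solve(A, U[obs].T @ R[obs, j])
    return U, V

U, V = fit_pmf(np.where(mask, R_full, 0.0), mask, k=k)

def next_question(i, U, V, mask, lam=0.1, sigma2=0.01):
    """Variance-minimizing selection: ask the unanswered question whose
    predictive variance under respondent i's factor posterior is largest."""
    obs = mask[i]
    # Gaussian posterior covariance of respondent i's latent factors.
    prec = V[obs].T @ V[obs] / sigma2 + lam * np.eye(V.shape[1])
    cov = np.linalg.inv(prec)
    var = np.einsum('jk,kl,jl->j', V, cov, V)  # v_j^T cov v_j per question
    var[obs] = -np.inf  # never re-ask answered questions
    return int(np.argmax(var))

j = next_question(0, U, V, mask)
```

In this sketch, asking the highest-variance question is equivalent to greedily shrinking the posterior uncertainty over the respondent's latent factors, which is what makes the shortened instrument efficient relative to random question selection.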