Unsupervised feature selection is a dimensionality reduction technique that has been widely used as an indispensable preprocessing step in many tasks. However, real-world data are not only high-dimensional but also exhibit intrinsic correlations between data points, which are not fully exploited by existing feature selection methods. Moreover, real-world data inevitably contain noise or outliers. To select features more effectively, a sparse regression model based on latent low-rank representation with a symmetric constraint is proposed for unsupervised feature selection. From the coefficient matrix of the non-negative symmetric low-rank representation, an affinity matrix that characterizes the correlations between data points is adaptively obtained; it reveals the intrinsic geometric relationships, global structure, and discriminative information of the data. A latent representation of all data points, derived from this affinity matrix, is employed as pseudo-labels, and feature selection is then carried out by sparse linear regression. The method thus performs feature selection in the learned latent space rather than in the original data space. An alternating iterative algorithm is designed to solve the proposed model, and its effectiveness and efficiency are verified on several benchmark data sets.
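To make the pipeline concrete, the following is a minimal sketch of the overall idea, not the authors' exact formulation or optimization algorithm. It assumes the low-rank self-representation is crudely approximated by a truncated SVD, the non-negative symmetric affinity is formed by symmetrizing the absolute coefficients, the latent pseudo-labels are taken as a spectral embedding of that affinity, and the row-sparse regression is replaced by scikit-learn's MultiTaskLasso (which, like an l2,1 penalty, zeros out entire rows of the coefficient matrix).

```python
# Hedged sketch of the described pipeline; the components below are simplified
# stand-ins, not the paper's latent low-rank model or its alternating solver.
import numpy as np
from numpy.linalg import svd
from scipy.linalg import eigh
from sklearn.linear_model import MultiTaskLasso

def select_features(X, n_pseudo_labels=5, rank=10, alpha=0.1, n_selected=50):
    """X: (n_samples, n_features) data matrix; returns indices of selected features."""
    # 1) Crude low-rank self-representation X ~ Z X, approximated via truncated SVD
    #    (a stand-in for the paper's latent low-rank representation).
    U, s, Vt = svd(X, full_matrices=False)
    r = min(rank, len(s))
    Z = U[:, :r] @ U[:, :r].T            # low-rank coefficient matrix (n x n)
    # 2) Non-negative, symmetric affinity matrix from the coefficients.
    A = np.abs(Z + Z.T) / 2.0
    np.fill_diagonal(A, 0.0)
    # 3) Latent representation (pseudo-labels) via spectral embedding of the affinity.
    D = np.diag(A.sum(axis=1) + 1e-12)
    L = D - A                             # unnormalized graph Laplacian
    _, vecs = eigh(L)
    Y = vecs[:, 1:n_pseudo_labels + 1]    # skip the trivial constant eigenvector
    # 4) Row-sparse regression X W ~ Y; MultiTaskLasso removes whole rows of W.
    W = MultiTaskLasso(alpha=alpha, max_iter=5000).fit(X, Y).coef_.T  # (n_features, k)
    scores = np.linalg.norm(W, axis=1)    # feature importance = row norms of W
    return np.argsort(-scores)[:n_selected]
```

In this sketch, features are ranked by the row norms of the learned regression matrix, mirroring how sparse-regression-based selectors typically score features; the parameter names and the helper `select_features` are illustrative only.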