It has been proven that modified ridge regularized linear models (MRRLMs) can get "very close" to identifying a subset of the Markov boundary. However, this result assumes that the covariance matrix is non-singular, so MRRLMs cannot be applied to discover the Markov boundary (subset) from data sets whose covariance matrix is singular. A singular covariance matrix indicates that some variables in the data set are collinear, and such data sets are widespread in the real world. In this paper, we present a novel variant of ridge regularized linear models (VRRLMs) to identify a subset of the Markov boundary from data sets with both collinear and non-collinear variables, and we theoretically reveal the relationship between the covariance matrix and the collinearity of variables. In addition, we prove that VRRLMs can identify a subset of the Markov boundary under some reasonable assumptions and verify the theory on four discrete data sets. The results show that VRRLMs outperform MRRLMs in discovering a subset of the Markov boundary on data sets with collinear variables, while the two methods achieve similar discovery efficiency of the Markov boundary (subset) on data sets with non-collinear variables.
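The link between collinearity and a singular covariance matrix can be illustrated with a minimal sketch (the data and variable names below are hypothetical, not taken from the paper's experiments): when one variable is an exact linear combination of others, the sample covariance matrix becomes rank-deficient, which is precisely the case excluded by the non-singularity assumption of MRRLMs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data set with a collinear variable:
# x3 is an exact linear combination of x1 and x2.
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
x3 = 2.0 * x1 - 0.5 * x2          # collinear with x1 and x2
X = np.column_stack([x1, x2, x3])

# Sample covariance matrix (variables in columns).
cov = np.cov(X, rowvar=False)

# Collinearity makes the 3x3 covariance matrix rank-deficient,
# i.e. singular, so ridge-type corrections are needed.
print(np.linalg.matrix_rank(cov))  # rank 2 < 3
```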