The hat matrix is an important auxiliary quantity in linear regression theory for detecting errors in the predictors. Traditionally, comparing its diagonal elements with a calibration point serves as the decision rule for separating a dominant linear population from outliers. However, several problems exist: first, the calibration point is not well defined, because no exact statistical distribution (asymptotic form) of the hat matrix diagonal exists [1]; second, being based on the standard covariance matrix, this outlyingness measure loses its efficiency when the rate of "atypical" observations becomes large [2], [3]. In this paper, we present a discriminative version of the hat matrix (DHM) which recasts this classification problem as a subspace clustering problem. We propose a linear discriminant analysis based criterion built directly on the properties of the hat matrix, and we show that its maximization leads to the search for an optimal projection subspace and an optimal indicator matrix. We also show that the statistic of the hat matrix diagonal "projected" onto this optimal subspace follows an exact χ² distribution, which makes it possible to identify outliers by hypothesis testing. Synthetic data sets are used to study the performance of the proposed approach in terms of both regression and classification. We also illustrate its potential application to motion segmentation in image sequences.
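For reference, the classical quantities the abstract alludes to can be sketched as follows. This is a minimal recap of standard linear regression theory, not the paper's discriminative construction; the calibration threshold shown is one common convention and is only an assumption about the baseline rule being discussed.

```latex
% Hat (projection) matrix of the linear model y = X\beta + \varepsilon,
% where X is the n x p design matrix, assumed of full column rank:
H = X \left( X^{\top} X \right)^{-1} X^{\top}, \qquad \hat{y} = H y .

% The diagonal elements h_{ii} are the leverages, with
% 0 \le h_{ii} \le 1 and \operatorname{tr}(H) = p.
% A classical decision rule flags observation i as atypical in the
% predictor space when its leverage exceeds a calibration point, e.g.
h_{ii} > \frac{2p}{n}
\quad \text{(a common rule of thumb; other calibration points are used).}
```

The paper's point of departure is that this calibration point lacks an exact distributional justification and degrades when many observations are atypical, which motivates the discriminative hat matrix introduced above.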