This paper discusses robust classification of hyperspectral images. Methods for both dimensionality reduction and robust estimation of classifier parameters in full dimension are presented. A new approach to dimensionality reduction, based on piecewise constant function approximation of the spectral curve, is compared to conventional dimensionality reduction methods such as principal component analysis, feature selection, and decision boundary feature extraction. Computing robust estimates of the decision boundary in full dimension is an alternative to dimensionality reduction. Two recently proposed techniques for covariance estimation, based on the eigenvector decomposition and the Cholesky decomposition, are compared to Support Vector Machine classifiers, simple regularized estimates, and conventional quadratic classifiers. Experimental results on four hyperspectral data sets demonstrate the importance of using simple, sparse models. The sparse model based on the Cholesky decomposition in full dimension performed slightly better than dimensionality reduction; however, if speed is an issue, the piecewise constant function approximation method for dimensionality reduction can be used instead.
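To make the dimensionality reduction idea concrete, the sketch below illustrates one simple form of piecewise constant function approximation of the spectral curve: each pixel's spectrum is replaced by its mean value over a few contiguous band segments. This is only a minimal illustration in Python; the function name, the fixed segment boundaries, and the random stand-in data are assumptions, and the paper's actual criterion for choosing the segments is not reproduced here.

```python
import numpy as np

def piecewise_constant_features(spectra, breakpoints):
    """Reduce each spectral curve to its mean over contiguous band segments.

    spectra:     (n_pixels, n_bands) array of reflectance values.
    breakpoints: sorted band indices splitting [0, n_bands) into segments,
                 e.g. [0, 40, 90, 140, 200] yields 4 features per pixel.
    (Illustrative helper; not the paper's exact algorithm.)
    """
    features = []
    for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
        # Each feature is the average over one contiguous block of bands,
        # i.e. a piecewise constant approximation of the spectral curve.
        features.append(spectra[:, lo:hi].mean(axis=1))
    return np.column_stack(features)

# Example: reduce a 200-band cube (flattened to pixels x bands)
# to 4 piecewise constant features per pixel.
rng = np.random.default_rng(0)
cube = rng.random((1000, 200))   # stand-in for real hyperspectral data
X = piecewise_constant_features(cube, [0, 40, 90, 140, 200])
print(X.shape)                   # (1000, 4)
```

Because each reduced feature is just a block average, this transformation is very cheap to compute, which is consistent with the abstract's remark that the piecewise constant approach is attractive when speed is an issue.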