We treat the Feature Selection problem in the Support Vector Machine (SVM) framework by adopting an optimization model based on the $\ell_0$ pseudo-norm. The objective is to control the number of nonzero components of the normal vector to the separating hyperplane, while maintaining satisfactory classification accuracy. In our model the polyhedral norm $\|\cdot\|_{[k]}$, intermediate between $\|\cdot\|_1$ and $\|\cdot\|_\infty$, plays a significant role, allowing us to formulate a DC (Difference of Convex) optimization problem that is tackled by means of the DCA algorithm. The results of several numerical experiments on benchmark classification datasets are reported.
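For context, the polyhedral $k$-norm mentioned above is standardly defined as the sum of the $k$ largest absolute components of a vector, and its relation to the $\ell_0$ pseudo-norm is what typically produces the DC structure; the sketch below follows this standard construction and may differ in detail from the exact model adopted in the paper.
\[
  \|x\|_{[k]} \;=\; \sum_{i=1}^{k} |x_{(i)}|,
  \qquad |x_{(1)}| \ge |x_{(2)}| \ge \dots \ge |x_{(n)}|,
\]
so that $\|x\|_\infty = \|x\|_{[1]} \le \|x\|_{[k]} \le \|x\|_{[n]} = \|x\|_1$. Moreover,
\[
  \|x\|_0 \le k \;\Longleftrightarrow\; \|x\|_1 - \|x\|_{[k]} = 0,
\]
so penalizing the difference $\|x\|_1 - \|x\|_{[k]}$ of two convex norms yields a DC (Difference of Convex) objective, which is the type of problem DCA is designed to handle.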