Nonconvex Regularization in Remote Sensing
Devis Tuia, Senior Member, IEEE, Rémi Flamary, and Michel Barlaud, Senior Member, IEEE

Abstract: In this paper, we study the effect of different regularizers and their implications in high-dimensional image classification and sparse linear unmixing. Although kernelization and sparse methods are widely accepted solutions for processing data in high dimensions, we present here a study on the impact of the form of regularization used and of its parameterization. We consider regularization via the traditional squared (ℓ2) and sparsity-promoting (ℓ1) norms, as well as more unconventional nonconvex regularizers (ℓp and the log-sum penalty). We compare their properties and advantages on several classification and linear unmixing tasks and provide advice on the choice of the best regularizer for the problem at hand. Finally, we also provide a fully functional toolbox for the community.
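For reference, the four regularizers named in the abstract are most commonly written as follows for a weight vector w with entries w_j, j = 1, ..., d. This is a generic sketch of the standard forms; the exact parameterization used in the paper (for instance the smoothing constant ε in the log-sum penalty) may differ:

\Omega_{\ell_2}(\mathbf{w}) = \|\mathbf{w}\|_2^2 = \sum_{j=1}^{d} w_j^2

\Omega_{\ell_1}(\mathbf{w}) = \|\mathbf{w}\|_1 = \sum_{j=1}^{d} |w_j|

\Omega_{\ell_p}(\mathbf{w}) = \sum_{j=1}^{d} |w_j|^p, \quad 0 < p < 1

\Omega_{\mathrm{LSP}}(\mathbf{w}) = \sum_{j=1}^{d} \log\left(1 + \frac{|w_j|}{\varepsilon}\right), \quad \varepsilon > 0

The first two are convex (the ℓ1 norm being the standard sparsity-promoting surrogate), while the ℓp quasi-norm with p < 1 and the log-sum penalty are nonconvex and penalize large coefficients less aggressively, which is what motivates their comparison in the paper.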