In this paper, we study the effect of different regularizers and their implications in high-dimensional image classification and sparse linear unmixing. Although kernelization and sparse methods are widely accepted solutions for processing data in high dimensions, we present here a study of the impact of the form of regularization used and of its parameterization. We consider regularization via the traditional squared (ℓ2) and sparsity-promoting (ℓ1) norms, as well as more unconventional nonconvex regularizers (ℓp and the log sum penalty). We compare their properties and advantages on several classification and linear unmixing tasks and provide advice on choosing the best regularizer for the problem at hand. Finally, we also provide a fully functional toolbox for the community.
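To make the four penalties concrete, the sketch below evaluates each regularizer on a weight vector. This is an illustrative implementation, not the paper's toolbox: the regularization weight `lam`, the exponent `p` for the ℓp penalty, and the smoothing parameter `eps` in the log sum penalty (here in the common form Σ log(1 + |w_i|/eps)) are assumed parameterizations.

```python
import numpy as np

def l2_penalty(w, lam):
    # Squared l2 norm: lam * ||w||_2^2 (smooth, shrinks all coefficients).
    return lam * np.sum(w ** 2)

def l1_penalty(w, lam):
    # Sparsity-promoting l1 norm: lam * ||w||_1 (convex, drives small
    # coefficients exactly to zero).
    return lam * np.sum(np.abs(w))

def lp_penalty(w, lam, p=0.5):
    # Nonconvex lp pseudo-norm with 0 < p < 1: lam * sum(|w_i|^p);
    # promotes sparsity more aggressively than l1. The choice p=0.5
    # is an illustrative default.
    return lam * np.sum(np.abs(w) ** p)

def log_sum_penalty(w, lam, eps=1e-3):
    # Log sum penalty in the form lam * sum(log(1 + |w_i| / eps));
    # eps is an assumed smoothing parameter.
    return lam * np.sum(np.log1p(np.abs(w) / eps))

# Example: compare the penalties on one sparse weight vector.
w = np.array([1.0, -2.0, 0.0])
print(l2_penalty(w, 1.0))   # 5.0
print(l1_penalty(w, 1.0))   # 3.0
```

Note how the nonconvex penalties (ℓp, log sum) penalize large coefficients less than ℓ1 does relative to small ones, which is what makes them stronger sparsity promoters.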
Number of pages: 11
Journal: IEEE Transactions on Geoscience and Remote Sensing
Publication status: Published - Nov 2016
Keywords: remote sensing