2016
DOI: 10.1109/tgrs.2016.2585201

Nonconvex Regularization in Remote Sensing

Abstract: In this paper, we study the effect of different regularizers and their implications in high-dimensional image classification and sparse linear unmixing. Although kernelization or sparse methods are globally accepted solutions for processing data in high dimensions, we present here a study on the impact of the form of regularization used and of its parameterization. We consider regularization via traditional squared (ℓ2) and sparsity-promoting (ℓ1) norms, as well as more unconventional nonconvex regularizers (ℓp an…
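The regularizers compared in the abstract differ most visibly in their proximal operators, which determine how each penalty shrinks coefficients. A minimal NumPy sketch (illustrative only, not the paper's code; the penalty weight `lam` and test vector are arbitrary) contrasting the ℓ2, ℓ1, and ℓ0 behaviors:

```python
import numpy as np

def prox_l2(w, lam):
    # Proximal operator of (lam/2)*||w||_2^2: uniform shrinkage toward zero.
    return w / (1.0 + lam)

def prox_l1(w, lam):
    # Proximal operator of lam*||w||_1: soft thresholding (promotes sparsity).
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def prox_l0(w, lam):
    # Proximal operator of lam*||w||_0: hard thresholding, the nonconvex
    # limit that the lp (p < 1) penalties interpolate toward.
    return np.where(np.abs(w) > np.sqrt(2.0 * lam), w, 0.0)

w = np.array([3.0, 0.5, -2.0, 0.1])
print(prox_l2(w, 1.0))   # every coefficient shrunk, none exactly zero
print(prox_l1(w, 1.0))   # small coefficients set exactly to zero, large ones biased
print(prox_l0(w, 1.0))   # small coefficients zeroed, large ones kept unbiased
```

The ℓ1 prox zeroes small entries but also biases the surviving ones, which is one motivation for the nonconvex ℓp penalties the paper studies.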

Cited by 27 publications (15 citation statements)
References 70 publications
“…The selected VI were finally tested to discriminate among treatments using L2-Regularized Logistic Regression (RLR), also known as Ridge regression (Friedman & Popescu 2004, Hoerl & Kennard 1970). RLR has been widely used for classification purposes in various domains including remote sensing (Tuia et al 2016, Zhang et al 2015), and has proved efficient when applied to vegetation (Erudel et al 2017). We only trained the RLR classifier at leaf scale to assess the robustness of VI at higher acquisition scales.…”
Section: Vegetation Indices
confidence: 99%
“…This limits the applicability of proximal methods to solve this type of problem, and other types of algorithms (or of nonconvex sparsity-inducing penalties) have been investigated in remote sensing (see e.g. [49] and references therein).…”
Section: Data
confidence: 99%
“…While we acknowledge that imposing a lower value on aₙ may act as a sparse regularizer, in our experiments the best ℓ1-norm-based solution was always found with aₙ = 1, that is, the FCLS solution (1). The nonconvex ℓp (p < 1) norm regularization recently proposed in [10] may be better suited for enforcing sparsity, although the optimization strategies in [10] also discard the sum-to-one constraint. Standard greedy iterative algorithms [13] are likewise ill-adapted to the sum-to-one constraint and to non-normalized dictionaries: since endmembers are reflectance spectra, the dictionary cannot be normalized without affecting the abundance estimation [12].…”
Section: Cardinality Constraint (ℓ0-Norm Sparsity)
confidence: 99%
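The abundance constraints that the excerpt above says sparse penalties struggle with (positivity plus sum-to-one) can be enforced exactly by projecting onto the probability simplex. A toy sketch (my own illustration under stated assumptions, not the authors' FCLS implementation; the endmember matrix `E` is random) of a projected-gradient solver for the fully constrained least-squares problem:

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto {a : a >= 0, sum(a) = 1}, the abundance
    # constraint set of fully constrained unmixing (sorting-based method).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def fcls_pgd(y, E, steps=500):
    # Projected gradient descent for min_a ||y - E a||^2 subject to the
    # abundance constraints; a stand-in for the FCLS solution.
    a = np.full(E.shape[1], 1.0 / E.shape[1])
    lr = 1.0 / np.linalg.norm(E.T @ E, 2)   # 1/L step size
    for _ in range(steps):
        a = project_simplex(a - lr * (E.T @ (E @ a - y)))
    return a

rng = np.random.default_rng(0)
E = rng.random((50, 4))                # hypothetical endmember spectra
a_true = np.array([0.6, 0.4, 0.0, 0.0])
y = E @ a_true                         # noiseless mixed pixel
a_hat = fcls_pgd(y, E)
print(np.round(a_hat, 3))              # close to a_true; sums to ~1
```

Note that the projection itself drives small abundances exactly to zero, which is why a hard sum-to-one constraint already interacts with sparsity, as the excerpt observes.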
“…Cardinality constraints resort to combinatorial optimization. Sparse spectral unmixing has been proposed, e.g., with ℓ1-norm [8,9] or ℓp (p < 1) norm [10] regularization, or with a greedy Orthogonal Matching Pursuit procedure [9]. While computationally attractive, such approaches usually do not solve the original sparse problem, and they face difficulties in accommodating the sum-to-one constraint and non-normalized dictionaries.…”
Section: Introduction
confidence: 99%
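The greedy Orthogonal Matching Pursuit procedure mentioned in the excerpt can be sketched in a few lines. This illustrative version (not the cited implementation) picks atoms by correlation with the residual, which is precisely where the non-normalized reflectance dictionaries flagged above cause trouble: columns with larger norms bias the selection step.

```python
import numpy as np

def omp(y, D, k):
    # Orthogonal Matching Pursuit: greedily select k dictionary atoms,
    # re-fitting the coefficients on the support by least squares each step.
    residual, support = y.astype(float).copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # Correlation-based selection: implicitly assumes unit-norm columns,
        # an assumption violated by reflectance-spectra dictionaries.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

D = np.eye(5)                       # toy orthonormal dictionary
y = np.array([0.0, 3.0, 0.0, -2.0, 0.0])
print(omp(y, D, 2))                 # recovers the 2-sparse coefficients
```

Nothing in this loop enforces positivity or sum-to-one on the coefficients, which illustrates the excerpt's point that greedy schemes do not accommodate the abundance constraints directly.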