2014
DOI: 10.1109/tip.2014.2362056
Blind and Fully Constrained Unmixing of Hyperspectral Images

Abstract: This paper addresses the problem of blind and fully constrained unmixing of hyperspectral images. Unmixing is performed without the use of any dictionary, and assumes that the number of constituent materials in the scene and their spectral signatures are unknown. The estimated abundances satisfy the desired sum-to-one and nonnegativity constraints. Two models with increasing complexity are developed to achieve this challenging task, depending on how noise interacts with hyperspectral data. The first one leads …

Cited by 39 publications (37 citation statements)
References 31 publications (51 reference statements)
“…The first is that since the linear constraints of the ADMM are only satisfied asymptotically, we have no guarantee that all the entries of the supposedly discarded rows of the feature matrix Φ will be exactly zero (and this actually happens in practice). Then an arbitrary thresholding step is required to eliminate endmembers with a small contribution [13,1]. The second is that in order to obtain the appropriate sparsity level, the regularization parameter λ needs to be optimized through a grid search, which is computationally very costly, and requires a criterion to select the best run of the algorithm.…”
Section: Collaborative Sparsity For Hyperspectral Unmixing
confidence: 99%
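The thresholding step this excerpt criticizes can be illustrated with a minimal sketch (names, the threshold value, and the toy data are all assumptions, not the cited algorithm's actual code): after ADMM, the rows of the abundance matrix Φ that should be discarded are only numerically small, never exactly zero, so an arbitrary ℓ2-norm threshold is applied to prune low-contribution endmembers.

```python
import numpy as np

def prune_endmembers(Phi, tau=1e-3):
    """Discard endmembers whose abundance rows are numerically
    (but not exactly) zero after ADMM, via a row l2-norm threshold.

    Phi : (R, N) abundance matrix, one row per candidate endmember.
    tau : threshold on the row l2 norm -- an arbitrary choice,
          which is precisely the issue raised in the excerpt.
    """
    row_norms = np.linalg.norm(Phi, axis=1)
    keep = row_norms > tau
    return Phi[keep], keep

# Toy example: two active endmembers, one row with tiny residual values.
Phi = np.array([[0.6, 0.7, 0.5],
                [1e-6, 2e-6, 5e-7],   # "discarded" row, not exactly zero
                [0.4, 0.3, 0.5]])
pruned, kept = prune_endmembers(Phi, tau=1e-3)
print(kept.tolist())   # [True, False, True]
print(pruned.shape)    # (2, 3)
```

The second issue in the excerpt, tuning λ by grid search, would wrap a loop around the whole solver, re-running it once per candidate λ and scoring each run with some model-selection criterion.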
“…We assume a finite-length blurring kernel of size L along the time dimension, centered around 0, which means that past and future values of x k contribute to the observation y k . In order to make the blurring kernel causal, it has to be shifted by (L − 1)/2, which means that the observations need to be delayed by…”
Section: Blurring and Causality Issues
confidence: 99%
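The shift-and-delay argument in the excerpt can be checked numerically. A minimal sketch (the kernel values and lengths are invented for illustration): convolving an impulse with a centered odd-length kernel leaves the peak in place, while the causal (shifted) version moves it (L − 1)/2 samples later, which is exactly the delay the observations must absorb.

```python
import numpy as np

L = 5
delay = (L - 1) // 2                          # latency introduced by the shift
h = np.array([0.05, 0.25, 0.4, 0.25, 0.05])  # centered blurring kernel

x = np.zeros(20)
x[10] = 1.0                                  # unit impulse at k = 10

# 'same' convolution keeps the centered-kernel alignment: peak stays at k = 10.
y_centered = np.convolve(x, h, mode='same')
# 'full' convolution corresponds to the causal, shifted kernel:
# the peak lands at k = 10 + (L - 1)/2.
y_causal = np.convolve(x, h, mode='full')

print(int(np.argmax(y_centered)))  # 10
print(int(np.argmax(y_causal)))    # 12
```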
“…This information can be used to characterize objects with great precision and detail in a number of areas, including agricultural monitoring, industrial inspection, and defense. The core characteristics of hyperspectral images raise new data processing issues ranging from image restoration to pattern recognition [1], [2]. Several sensing techniques have been developed for hyperspectral imaging.…”
Section: Introduction
confidence: 99%
“…In SU, hyperspectral vectors are approximated by a linear combination of a "small" number of spectral signatures in the library, and the number of columns of the abundance matrix equals the number of pixels, so the nonzero abundances should be concentrated in only a few rows [35], which implies sparsity shared across the pixels of an HSI. The collaborative (also called "simultaneous" or "multitask") sparse regression approach has shown advantages over noncollaborative ones; in particular, the mutual coherence has a weaker impact on the unmixing [18,34,36].…”
Section: Robust Collaborative Sparse Regression
confidence: 99%
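The row-wise sparsity described in this excerpt is typically enforced with the mixed ℓ2,1 norm, whose proximal operator is row-wise soft thresholding. A minimal sketch of that penalty and its prox (function names and the toy matrix are assumptions for illustration, not the cited papers' code): shrinking rows jointly drives the abundances of an unused library signature to zero across all pixels at once.

```python
import numpy as np

def l21_norm(X):
    """Mixed l2,1 norm: sum of the l2 norms of the rows of X.
    Penalizing it zeroes entire rows jointly across all pixels,
    which is the 'collaborative' sparsity used in sparse unmixing."""
    return np.linalg.norm(X, axis=1).sum()

def prox_l21(X, t):
    """Proximal operator of t * ||.||_{2,1}: row-wise soft thresholding."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return scale * X

X = np.array([[3.0, 4.0],    # row l2 norm 5.0 -> shrunk but kept
              [0.3, 0.4]])   # row l2 norm 0.5 -> zeroed entirely
Y = prox_l21(X, t=1.0)
print(np.round(Y, 2))  # [[2.4 3.2] [0. 0.]]
```

Note how the second row vanishes as a whole, rather than entry by entry as a plain ℓ1 penalty would do; this joint behavior is what weakens the influence of mutual coherence on the unmixing.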