2021
DOI: 10.1109/jstars.2021.3092566

Spectral-Spatial Constrained Nonnegative Matrix Factorization for Spectral Mixture Analysis of Hyperspectral Images

Abstract: Hyperspectral Spectral Mixture Analysis (SMA), which aims to decompose mixed pixels into a collection of endmembers weighted by their corresponding fractional abundances, has been successfully used to tackle the mixed-pixel problem in hyperspectral remote sensing applications. As an approach to decomposing a high-dimensional data matrix into the product of two nonnegative matrices, Nonnegative Matrix Factorization (NMF) has shown its advantages and has been widely applied to SMA. Unfortunately, most of the NMF…

Cited by 4 publications (4 citation statements) | References 59 publications
“…4) Other weight learning: Different from the above perspectives on establishing a graph regularizer, [113] was based on the local linear embedding assumption, where the weight matrix W is learned by minimizing the following equation:…”
Section: Manifold Constraints
confidence: 99%
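The local-linear-embedding idea referenced above can be sketched as follows. This is an illustrative implementation of the standard LLE reconstruction weights (each sample approximated by an affine combination of its k nearest neighbors), not the exact formulation of [113]; the neighborhood size `k` and regularizer `reg` are assumptions for the sketch.

```python
import numpy as np

def lle_weights(X, k=5, reg=1e-3):
    """Learn a weight matrix W under the local linear embedding assumption:
    each row of X is reconstructed from its k nearest neighbours with
    weights that sum to one."""
    n = X.shape[0]
    W = np.zeros((n, n))
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]        # k nearest neighbours, skipping the point itself
        Z = X[idx] - X[i]                       # neighbours centred on x_i
        C = Z @ Z.T                             # local Gram matrix (k x k)
        C = C + reg * np.trace(C) * np.eye(k)   # regularise for numerical stability
        w = np.linalg.solve(C, np.ones(k))      # solve C w = 1
        W[i, idx] = w / w.sum()                 # enforce the sum-to-one constraint
    return W
```

Minimizing the reconstruction error with these fixed weights is what makes the resulting graph regularizer preserve local geometry.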
“…Hence, the classical NMF model defined by the least-squares loss is sensitive to noise, which can dramatically degrade unmixing performance. To improve the robustness of NMF, many models based on alternative metrics have been reported, including but not limited to the bounded Itakura-Saito (IS) divergence [125], the L2,1-norm regularizer [62], [113], [126], [127], CIM [90], [94], [128], [129], the Cauchy function [130], and a general robust loss function [131]. The bounded IS divergence was employed to address additive, multiplicative, and mixed noises in HSIs [125].…”
Section: Robust NMF
confidence: 99%
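The robustness argument above can be made concrete with a minimal sketch of one common L2,1-norm NMF scheme, the multiplicative updates of Kong et al. (2011); the cited works [62], [113], [126], [127] may differ in details, and the iteration count and seed here are arbitrary. Replacing the squared Frobenius loss with a per-pixel L2 norm down-weights noisy pixels instead of letting them dominate the fit.

```python
import numpy as np

def robust_nmf_l21(X, r, n_iter=200, eps=1e-9, seed=0):
    """Sketch of L2,1-norm NMF: minimise sum_i ||x_i - A s_i||_2 over
    nonnegative A (endmembers) and S (abundances) via multiplicative updates."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    A = rng.random((m, r))
    S = rng.random((r, n))
    for _ in range(n_iter):
        # per-pixel reweighting d_i = 1 / ||x_i - A s_i||_2 (the L2,1 mechanism)
        d = 1.0 / (np.linalg.norm(X - A @ S, axis=0) + eps)
        S *= (A.T @ (X * d)) / (A.T @ (A @ (S * d)) + eps)
        d = 1.0 / (np.linalg.norm(X - A @ S, axis=0) + eps)
        A *= ((X * d) @ S.T) / (A @ ((S * d) @ S.T) + eps)
    return A, S
```

The updates keep both factors nonnegative by construction, since they multiply by ratios of nonnegative quantities.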
“…For the LSMM, traditional unmixing approaches include those based on geometry [15], [16], [17], statistics [18], and matrix decomposition [19], [20], [21], [22], [23], [24], [25], [26]. Geometry-based unmixing methods assume that pure endmember pixels exist in the image, but this assumption does not always hold.…”
Section: Introduction
confidence: 99%
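Once endmembers are known, the LSMM abundance-estimation step can be illustrated with fully-constrained least squares (FCLS, Heinz & Chang, 2001), one standard inversion of the linear mixture model. A sketch assuming SciPy is available; the weight `delta` used to fold the sum-to-one constraint into the nonnegative least-squares system is a tuning choice, not a value from the sources cited here.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_abundances(X, E, delta=1e3):
    """For each pixel x (column of X), solve min ||x - E a||^2 subject to
    a >= 0 and sum(a) = 1, where E holds one endmember spectrum per column.
    The sum-to-one constraint is enforced by a heavily weighted extra row."""
    n_end = E.shape[1]
    E_aug = np.vstack([delta * np.ones((1, n_end)), E])  # augmented system
    A = np.empty((n_end, X.shape[1]))
    for j in range(X.shape[1]):
        x_aug = np.concatenate([[delta], X[:, j]])
        A[:, j], _ = nnls(E_aug, x_aug)                  # nonnegativity via NNLS
    return A
```

Matrix-decomposition methods such as NMF avoid the pure-pixel assumption entirely by estimating E and A jointly rather than inverting a fixed E per pixel.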