2018
DOI: 10.1109/tgrs.2018.2834567
Hyperspectral Unmixing Using Sparsity-Constrained Deep Nonnegative Matrix Factorization With Total Variation

Cited by 117 publications (69 citation statements)
References 53 publications
“…To verify the effectiveness of our proposed method, we conducted experiments on both simulated and real-world datasets. The compared hyperspectral unmixing methods include the baseline methods VCA-FCLS [16] and NMF [23], the sparsity-based methods L1/2-NMF [28] and graph-regularized L1/2-NMF (GLNMF) [37], the spatial-information-based methods SGSNMF [41] and TV-RSNMF [34], the multilayer NMF method MLNMF [52], and sparsity-constrained deep NMF with total variation (SDNMF-TV) [35]. The results were evaluated with two commonly used measures of quantitative unmixing performance: spectral angle distance (SAD) and root-mean-square error (RMSE).…”
Section: Results
Mentioning confidence: 99%
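SAD and RMSE are the standard unmixing metrics mentioned in this statement. As a point of reference, below is a minimal NumPy sketch of how they are commonly computed per endmember; the function and variable names are illustrative and not taken from the cited papers. SAD is the angle between an estimated and a reference endmember spectrum, and RMSE measures the error of the estimated abundance map.

import numpy as np

def spectral_angle_distance(m_est, m_ref):
    # Angle (in radians) between an estimated endmember spectrum and a reference spectrum.
    cos_theta = np.dot(m_est, m_ref) / (np.linalg.norm(m_est) * np.linalg.norm(m_ref))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def abundance_rmse(a_est, a_ref):
    # Root-mean-square error between estimated and reference abundance maps.
    return np.sqrt(np.mean((np.asarray(a_est) - np.asarray(a_ref)) ** 2))

Lower values of both metrics indicate better unmixing: SAD close to zero means the recovered spectra align with the references, and RMSE close to zero means the fractional abundances are accurately estimated.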
“…This is because endmembers are distributed in coherent geometric structures, and two correlated pixels usually have similar fractional abundances for the same endmembers. Therefore, the total variation (TV) regularizer [33]–[35] was incorporated to promote piecewise-smooth transitions in the abundance matrix for neighboring pixels of the same endmember category. In [36], abundance-separation- and smoothness-constrained NMF (ASSNMF) was proposed for hyperspectral unmixing.…”
Section: Introduction
Mentioning confidence: 99%
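To make the role of the TV penalty concrete, here is a minimal sketch of one common anisotropic form of total variation applied to a single endmember's 2-D abundance map. This is a NumPy illustration only; the cited papers may use different discretizations or weightings.

import numpy as np

def anisotropic_tv(abundance_map):
    # Sum of absolute differences between horizontally and vertically adjacent pixels;
    # small values mean the abundance map is piecewise smooth.
    dh = np.abs(np.diff(abundance_map, axis=1))  # horizontal neighbors
    dv = np.abs(np.diff(abundance_map, axis=0))  # vertical neighbors
    return dh.sum() + dv.sum()

Adding a term of this form to the unmixing objective penalizes abrupt abundance changes between neighboring pixels while still allowing sharp boundaries between homogeneous regions.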
“…Finally, as deep learning for NMF has so far received only preliminary investigation (e.g., [43]), deep NMF models for binary image representation and decomposition would be worth investigating in future work.…”
Section: Discussion
Mentioning confidence: 99%
“…The most popular algorithms for NMF belong to the class of multiplicative Lee-Seung algorithms, which have relatively low complexity but converge slowly and risk becoming stuck in local minima [9]. To improve the performance of NMF-based hyperspectral unmixing, further constraints were imposed on NMF [10]–[14]. Miao and Qi proposed a minimum-volume-constrained nonnegative matrix factorization [15].…”
Section: Introduction
Mentioning confidence: 99%
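For context, the multiplicative Lee-Seung updates referred to above minimize the Frobenius reconstruction error of X ≈ WH by alternating two elementwise update rules. Below is a minimal NumPy sketch of the unconstrained version, without any of the sparsity, TV, or minimum-volume constraints discussed in the cited works; names and defaults are illustrative.

import numpy as np

def nmf_multiplicative(X, rank, n_iter=200, eps=1e-9, seed=0):
    # Plain Lee-Seung multiplicative updates for X ~ W @ H under the Frobenius loss.
    # X: (bands, pixels) nonnegative data; W: endmember signatures; H: fractional abundances.
    rng = np.random.default_rng(seed)
    n_bands, n_pixels = X.shape
    W = rng.random((n_bands, rank))
    H = rng.random((rank, n_pixels))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)    # abundance update
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)  # endmember update
    return W, H

Each update keeps the factors nonnegative because it only multiplies by nonnegative ratios; the constrained variants cited above modify these ratios with additional penalty terms, which is also where the slow convergence and local-minima issues noted in the statement motivate further regularization.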