2018
DOI: 10.1109/tgrs.2018.2841036
Hyperspectral Unmixing Based on Incremental Kernel Nonnegative Matrix Factorization

Cited by 19 publications (7 citation statements)
References 24 publications
“…The statistical algorithms identify the endmembers and their corresponding abundances at the same time by utilizing the statistical properties of the HSI. Popular statistical algorithms include independent component analysis (ICA) [32], [33], nonnegative matrix factorization (NMF) [34], [35], and Bayesian approaches [36]- [38]. Among them, NMF provides a good fit for hyperspectral unmixing owing to its nonnegativity and interpretability.…”
Section: MV (mentioning; confidence: 99%)
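The linear mixing model behind this excerpt can be made concrete with a short sketch. Below is a minimal NMF-based unmixing example using the classic Lee–Seung multiplicative updates; the matrix shapes, endmember count, and function name are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def nmf_unmix(Y, n_endmembers, n_iter=500, eps=1e-9, seed=0):
    """Factorize a nonnegative pixel matrix Y (bands x pixels) as Y ~ E @ A,
    where E holds endmember spectra and A holds abundances, using the
    classic Lee-Seung multiplicative updates for the Euclidean loss."""
    rng = np.random.default_rng(seed)
    n_bands, n_pixels = Y.shape
    E = rng.random((n_bands, n_endmembers))   # endmember spectra (nonnegative)
    A = rng.random((n_endmembers, n_pixels))  # abundances (nonnegative)
    for _ in range(n_iter):
        # Multiplicative updates keep E and A nonnegative by construction.
        A *= (E.T @ Y) / (E.T @ E @ A + eps)
        E *= (Y @ A.T) / (E @ A @ A.T + eps)
    return E, A

# Toy usage: 50 spectral bands, 1000 pixels, 3 assumed endmembers.
Y = np.random.rand(50, 1000)
E, A = nmf_unmix(Y, n_endmembers=3)
print(np.linalg.norm(Y - E @ A) / np.linalg.norm(Y))  # relative reconstruction error
```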
“…For dealing with large-scale and streaming dynamic data, Zhu et al [138] proposed an online KNMF (OKNMF) framework to control the computational complexity via adopting the stochastic gradient descent (SGD), mini-batch, and averaged SGD (ASGD) strategies. In addition, the KNMF was extended to incremental KNMF (IKNMF) and improved IKNMF (IIKNMF) for desired unmixing accuracy and efficiency [34].…”
Section: B. Kernelized NMF (mentioning; confidence: 99%)
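To illustrate the general mini-batch/SGD idea referenced here, without reproducing the published OKNMF/IKNMF/IIKNMF update rules, a rough sketch follows; the batch size, learning rate, and inner iteration count are arbitrary assumptions.

```python
import numpy as np

def minibatch_nmf(Y, n_endmembers, batch_size=64, n_epochs=20, lr=1e-3,
                  inner_iter=30, eps=1e-9, seed=0):
    """Stream pixels in mini-batches: estimate abundances for each batch with
    E fixed, then take a projected stochastic gradient step on the endmember
    matrix E. This sketches only the mini-batch/SGD idea, not the cited
    OKNMF/IKNMF/IIKNMF algorithms."""
    rng = np.random.default_rng(seed)
    n_bands, n_pixels = Y.shape
    E = rng.random((n_bands, n_endmembers))
    for _ in range(n_epochs):
        order = rng.permutation(n_pixels)
        for start in range(0, n_pixels, batch_size):
            Yb = Y[:, order[start:start + batch_size]]
            # Abundances for this batch only (multiplicative updates, E fixed).
            Ab = rng.random((n_endmembers, Yb.shape[1]))
            for _ in range(inner_iter):
                Ab *= (E.T @ Yb) / (E.T @ E @ Ab + eps)
            # Projected SGD step on E; the projection keeps E nonnegative.
            grad_E = (E @ Ab - Yb) @ Ab.T
            E = np.maximum(E - lr * grad_E, 0.0)
    return E
```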
“…Research work has been devoted to HU. Among it, non-negative matrix factorization (NMF) has been shown to be a useful unsupervised decomposition for hyperspectral unmixing [6]. The learned non-negative basis vectors that are used are distributed, yet they are still sparse combinations that generate expressiveness in the signal reconstructions [7].…”
Section: Introduction (mentioning; confidence: 99%)
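The sparsity property mentioned in this excerpt is easy to inspect directly. Below is a small, hedged example using scikit-learn's generic NMF as a stand-in (not the unmixing algorithms discussed in the cited work); the component count and near-zero threshold are arbitrary.

```python
import numpy as np
from sklearn.decomposition import NMF

# Random nonnegative "pixel" matrix: 200 pixels x 50 bands (rows are samples).
X = np.random.rand(200, 50)

model = NMF(n_components=5, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(X)   # per-pixel coefficients (abundance-like)
H = model.components_        # nonnegative basis vectors (endmember-like)

# Fraction of near-zero coefficients: a crude sparsity measure.
print("sparsity of W:", np.mean(W < 1e-6))
```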
“…These methods map the data to a feature space of higher dimension where the mapped data can be represented with a linear model in this space [26]. It is important to note that finding the explicit mapping is however bypassed via the kernel trick [27][28][29][30][31]. In [32,33], the authors proposed a model consisting of a linear mixture and a nonlinear fluctuation.…”
(mentioning; confidence: 99%)
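The kernel trick described above can be demonstrated in a few lines: inner products in the implicit feature space are computed through a kernel function, never through an explicit mapping. The RBF kernel and bandwidth below are common choices used here purely as assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2): the inner product
    <phi(x_i), phi(z_j)> in an implicit feature space, computed without ever
    forming phi explicitly."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

# Pixels as rows (here: 6 pixels with 4 spectral bands each).
X = np.random.rand(6, 4)
K = rbf_kernel(X, X)
# Any algorithm written purely in terms of inner products can now replace them
# with entries of K, which is the essence of kernelized (e.g. KNMF) models.
print(K.shape, np.allclose(K, K.T))
```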
“…This algorithm is characterized by the trade-off parameter λ_NDU, fitting parameter µ_NDU and penalty parameter ρ_NDU. • The improved incremental KNMF (IIKNMF) [31]: This method extends KNMF by introducing partition matrix theory and considering the relationships among dividing blocks. The incremental KNMF (IKNMF) is proposed to reduce the computing requirements for large-scale data and IIKNMF aims to further improve the abundance results.…”
(mentioning; confidence: 99%)
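As background for the KNMF family this excerpt builds on, one common way to kernelize NMF is to express the endmembers as nonnegative combinations of the mapped pixels, so that only the kernel Gram matrix is ever needed. The sketch below follows that generic formulation with an RBF kernel and multiplicative updates; it is a hedged illustration, not the IKNMF/IIKNMF derivation (nor does it model the λ_NDU, µ_NDU, ρ_NDU parameters of the compared method).

```python
import numpy as np

def rbf_gram(X, gamma=0.5):
    """Pixel-by-pixel Gram matrix for X with pixels as columns."""
    sq = np.sum(X**2, 0)[:, None] + np.sum(X**2, 0)[None, :] - 2.0 * X.T @ X
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kernel_nmf(X, r, n_iter=300, gamma=0.5, eps=1e-9, seed=0):
    """Generic KNMF sketch: Phi(X) ~ Phi(X) @ B @ H, with endmembers written
    as nonnegative combinations (columns of B) of the mapped pixels, so all
    computations reduce to the kernel matrix K. Hedged illustration only."""
    rng = np.random.default_rng(seed)
    n_pixels = X.shape[1]
    K = rbf_gram(X, gamma)            # K[i, j] = <phi(x_i), phi(x_j)>
    B = rng.random((n_pixels, r))     # combination weights for endmembers
    H = rng.random((r, n_pixels))     # abundances
    for _ in range(n_iter):
        # Multiplicative updates; K is elementwise nonnegative for the RBF kernel.
        H *= (B.T @ K) / (B.T @ K @ B @ H + eps)
        B *= (K @ H.T) / (K @ B @ H @ H.T + eps)
    return B, H, K

# Toy usage: 40 bands x 300 pixels, 4 assumed endmembers in feature space.
X = np.random.rand(40, 300)
B, H, K = kernel_nmf(X, r=4)
```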