Bilinear mixture model-based methods have recently shown promising capability in nonlinear spectral unmixing. However, because they rely on endmembers extracted in advance, their unmixing accuracy degrades, especially when the data are highly mixed. In this paper, a geometric projection strategy is proposed and combined with constrained nonnegative matrix factorization for unsupervised nonlinear spectral unmixing. According to the characteristics of bilinear mixture models, a set of facets is determined, each of which represents the partial nonlinearity obtained by neglecting one endmember. Then, each pixel's barycentric coordinates with respect to every endmember are calculated in several newly constructed simplices using a distance measure. In this way, pixels can be projected onto their approximate linear mixture components, which greatly reduces the impact of collinearity. Unlike related nonlinear unmixing methods in the literature, this procedure facilitates a more accurate estimation of endmembers and abundances in constrained nonnegative matrix factorization. The updated endmembers are then used to reconstruct the facets and obtain new pixel projections. Finally, endmembers, abundances, and pixel projections are updated alternately until a satisfactory result is obtained. The superior performance of the proposed algorithm in nonlinear spectral unmixing has been validated through experiments on both synthetic and real hyperspectral data, in comparison with traditional and state-of-the-art algorithms.
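The core geometric step of the abstract above, computing a pixel's barycentric (abundance) coordinates with respect to a simplex spanned by endmembers, can be sketched as follows. This is an illustrative least-squares helper under a soft sum-to-one constraint, not the paper's exact distance-based projection; the names and the augmentation weight `delta` are assumptions.

```python
import numpy as np

def barycentric_coords(pixel, endmembers):
    """Least-squares barycentric (abundance) coordinates of `pixel` with
    respect to the simplex spanned by the columns of `endmembers`.
    The sum-to-one constraint is enforced softly by augmenting the
    system with a heavily weighted row of ones (an illustrative sketch,
    not the paper's exact projection)."""
    bands, p = endmembers.shape
    delta = 1e3  # weight of the sum-to-one row (hypothetical choice)
    A = np.vstack([endmembers, delta * np.ones((1, p))])
    b = np.concatenate([pixel, [delta]])
    coords, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coords

# Example: a pixel lying exactly on an edge of a 3-endmember simplex.
E = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])          # 3 bands x 3 endmembers
x = 0.6 * E[:, 0] + 0.4 * E[:, 1]        # linear mixture of two endmembers
a = barycentric_coords(x, E)
print(np.round(a, 3))                    # ~[0.6, 0.4, 0.0]
```

In the paper's setting these coordinates would be recomputed against each reconstructed facet, so the projection tracks the endmembers as they are updated by the NMF iterations.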
Kernel methods are attractive in data analysis because they can model nonlinear similarities between observations and provide rich representations, both of which are useful for regression problems in general domains. Despite their popularity, they suffer from two primary inherent drawbacks. The first is the positive-definiteness requirement on the kernel function, which greatly restricts their application to some real data analysis tasks. The second is their poor scalability in massive-data scenarios. In this paper, we address these two problems by applying the Nyström subsampling approach to coefficient-based regularized regression (Nyström CRR for short). Nyström subsampling is an effective way to reduce space and time complexity by constructing a low-rank approximation of the original kernel matrix through column subsampling. Coefficient-based regularized regression provides a simple paradigm for designing indefinite kernel methods. We show that the combination of these two schemes is not only computationally efficient but also statistically consistent, with minimax-optimal rates of convergence. We explicitly determine the lower bound on the subsampling level, as a function of the sample size, such that the minimax-optimal convergence rates are preserved. Our analysis shows that the subsampling level acts as a trade-off between the computational and asymptotic behavior of Nyström CRR, and is hence pivotal for algorithmic performance. To choose an appropriate subsampling level, we develop an incremental Nyström CRR for nested subsampling sets. The proposed algorithm greatly reduces the cost of cross-validation and even allows computing the whole solution path through all possible subsampling levels. In our empirical studies, the incremental Nyström CRR performs effective model selection and achieves state-of-the-art results on both synthetic and real data sets.
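The column-subsampling step underlying Nyström CRR can be sketched as below: m landmark points are drawn uniformly, and the full n-by-n kernel matrix is replaced by the rank-m factorization K ≈ K_nm K_mm⁺ K_nmᵀ. This is a generic Nyström sketch with an assumed Gaussian kernel, not the paper's CRR estimator; function names and the choice of uniform subsampling are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_approx(X, m, gamma=1.0, seed=0):
    """Rank-m Nystrom approximation K ~ K_nm @ pinv(K_mm) @ K_nm.T,
    built from m uniformly subsampled landmark points. Only the n x m
    and m x m blocks are ever formed from the landmarks, which is what
    yields the space/time savings over the full n x n matrix."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    K_nm = rbf_kernel(X, X[idx], gamma)      # n x m block
    K_mm = rbf_kernel(X[idx], X[idx], gamma) # m x m landmark block
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

X = np.random.default_rng(1).normal(size=(200, 3))
K = rbf_kernel(X, X)
K_hat = nystrom_approx(X, m=50)
err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
print(f"relative Frobenius error: {err:.3f}")
```

The subsampling level m is exactly the trade-off knob the abstract describes: larger m tightens the approximation but raises the cost, and the incremental scheme amounts to growing the landmark set in nested steps rather than recomputing from scratch.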