2021
DOI: 10.1109/jstars.2021.3072044

Adaptive Nonnegative Sparse Representation for Hyperspectral Image Super-Resolution

Abstract: As hyperspectral (HS) images usually have low spatial resolution, hyperspectral image (HSI) super-resolution has recently attracted increasing attention as a way to enhance the spatial resolution of HSIs. A common approach is to fuse the low-resolution (LR) HSI with a multispectral image (MSI) whose spatial resolution is higher than that of the HSI. In this paper, we propose a novel adaptive non-negative sparse representation (ANSR)-based model to fuse an HSI and its corresponding MSI. First, basing the linear spectral u…
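The abstract's core building block, representing each pixel spectrum as a nonnegative sparse combination of dictionary atoms, can be sketched as below. This is a generic illustration, not the paper's actual ANSR algorithm: the dictionary D, the penalty weight lam, and the projected-gradient solver are all placeholder choices.

```python
import numpy as np

def nonneg_sparse_code(D, y, lam=1e-3, n_iter=2000):
    """Minimize 0.5*||D x - y||^2 + lam*sum(x) subject to x >= 0.

    Solved with projected gradient descent (ISTA plus a
    nonnegativity projection); for x >= 0 the l1 penalty is
    linear, so its gradient is simply lam.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the quadratic term
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        x = np.maximum(0.0, x - (grad + lam) / L)
    return x

# Toy demo: recover a sparse nonnegative code for one pixel spectrum.
rng = np.random.default_rng(0)
D = np.abs(rng.normal(size=(31, 60)))      # 31 spectral bands, 60 dictionary atoms
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [0.8, 0.5, 1.2]      # three active atoms ("endmembers")
y = D @ x_true                              # observed (noise-free) spectrum
x_hat = nonneg_sparse_code(D, y)
```

In practice the coefficients are computed per pixel (or per patch) over a learned spectral dictionary; the nonnegativity constraint reflects the physical fact that abundances cannot be negative.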


Cited by 20 publications (7 citation statements)
References 62 publications
“…However, since HSI fusion is an ill-posed problem, various priors have been proposed to narrow the space of possible solutions and improve the robustness of fusion methods. For example, low-rank regularizers are used in unmixing-based methods [21], [26], [27] to constrain the number of final spectra; dictionary-learning-based methods learn a spectral dictionary from the low-resolution HSI, map it to an RGB dictionary using known SSFs, and apply sparse regularizers to obtain the HR HSI [19], [28], [29], [57]; the Bayesian-based method [18] assumes that the representation coefficients of the high-resolution HSI follow a Gaussian distribution and uses this assumption as a prior for high-resolution HSI fusion; tensor-based methods [16], [20], [22] exploit the redundancy of HSIs, group non-local similar cube patches into tensors, and apply sparse representation to construct them. In particular, Liu et al. [15] recast the tensor-trace-norm formulation to reconstruct HR HSIs via low-rank approximation, and Dian et al. [23] use low tensor-train rank (LTTR) as a regularization term on the grouped tensors of non-local patches.…”
Section: Related Work
confidence: 99%
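The dictionary-learning pipeline described in the statement above (learn a spectral dictionary from the LR HSI, map it to an RGB dictionary via a known SSF, code the MSI pixels, then reconstruct HR spectra) can be sketched as follows. This is a minimal skeleton with random data and a plain NNLS solver, not the implementation of any cited method; in the cited works the dictionary is learned and the coding uses sparse regularizers.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_bands, n_atoms = 31, 20

# Spectral dictionary, in practice learned from the LR HSI (random here).
D = np.abs(rng.normal(size=(n_bands, n_atoms)))

# Known spectral sensitivity function (SSF) mapping 31 bands to RGB,
# and the induced RGB dictionary.
R = np.abs(rng.normal(size=(3, n_bands)))
D_rgb = R @ D

# One HR MSI pixel synthesized from a sparse nonnegative code.
x_true = np.zeros(n_atoms)
x_true[[2, 9]] = [0.7, 1.1]
rgb_pixel = D_rgb @ x_true

# Code the RGB observation nonnegatively, then reconstruct the HR spectrum
# by applying the same code to the full spectral dictionary.
x_hat, _ = nnls(D_rgb, rgb_pixel)
hs_pixel = D @ x_hat
```

The key idea illustrated is that a single set of coefficients links the two dictionaries: fitting in the low-dimensional RGB space yields a code that, applied to the spectral dictionary, produces a full high-resolution spectrum.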
“…Fu et al. [19] introduced a bidirectional structure in single-HSI super-resolution to exploit the spatial-spectral correlation of the HSI and the global correlation along the spectra. Li [20] fused an MSI and an HSI using a novel adaptive nonnegative sparse representation-based model.…”
Section: Related Work
confidence: 99%
“…These methods effectively reduce data redundancy and preserve crucial information through dimensionality reduction and feature extraction. A novel adaptive non-negative sparse representation model, built on a non-negative structured sparse representation, is proposed for fusion in [26].…”
Section: Related Work
confidence: 99%