2021
DOI: 10.48550/arxiv.2111.09708
Preprint

A Trainable Spectral-Spatial Sparse Coding Model for Hyperspectral Image Restoration

Abstract: Hyperspectral imaging offers new perspectives for diverse applications, ranging from environmental monitoring with airborne or satellite remote sensing to precision farming, food safety, planetary exploration, and astrophysics. Unfortunately, the spectral diversity of information comes at the expense of various sources of degradation, and the lack of accurate ground-truth "clean" hyperspectral signals acquired on the spot makes restoration tasks challenging. In particular, training deep neural networks …

Cited by 2 publications (1 citation statement) | References: 58 publications
“…Traditional single HSI SR methods develop a mapping function from LR to HR HSIs, often relying on handcrafted prior knowledge (e.g., low-rank approximations [8] and sparse coding [9]) to address the inherent uncertainty in HR-HSI reconstruction. In these methods, prior knowledge acts as regularization to simulate image degradation in a forward mathematical model that captures the spectral properties and spatial structure of the input.…”
Section: Introduction
confidence: 99%
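
As context for the citation statement above, here is a minimal sketch of the kind of forward degradation model with a sparse-coding prior it alludes to, in generic notation (y: observed degraded image, x: latent clean image, H: a hypothetical linear degradation operator, n: noise, D: a dictionary, \alpha: sparse codes, \lambda: a regularization weight; these symbols are illustrative and not taken from the cited works):

\[
  y = Hx + n, \qquad \hat{x} = D\hat{\alpha}, \qquad
  \hat{\alpha} \in \arg\min_{\alpha}\; \tfrac{1}{2}\,\lVert y - HD\alpha \rVert_2^2 + \lambda \lVert \alpha \rVert_1 .
\]

In this sketch, the data-fidelity term encodes the forward model that simulates image degradation, while the \ell_1 penalty plays the role of the handcrafted prior used as regularization.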