2024
DOI: 10.1109/access.2024.3396990
Synthetic Data Pretraining for Hyperspectral Image Super-Resolution

Emanuele Aiello,
Mirko Agarla,
Diego Valsesia
et al.

Abstract: Large-scale self-supervised pretraining of deep learning models is known to be critical in several fields, such as language processing, where it has led to significant breakthroughs. Indeed, it is often more impactful than architectural design. However, the use of self-supervised pretraining lags behind in several domains, such as hyperspectral imaging, due to data scarcity. This paper addresses the challenge of data scarcity in the development of methods for spatial super-resolution of hyperspectral images (…)

Cited by 0 publications
References 35 publications