2022 · DOI: 10.1109/jstars.2022.3190316

Cross-Domain Classification of Multisource Remote Sensing Data Using Fractional Fusion and Spatial-Spectral Domain Adaptation

Abstract: The limited availability of labeled samples has always been a challenge for hyperspectral image (HSI) classification. In real remote sensing applications, an HSI scene may not be labeled at all. To address this problem, cross-domain learning methods have been developed that utilize another HSI scene with similar land covers and sufficient labeled samples. However, the disparity between HSI scenes remains a challenge that degrades classification performance, and it may be caused by variations in illumination […]
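The abstract names cross-scene transfer from a labeled source HSI to an unlabeled target HSI as the core mechanism, and the citation statement below describes it as GAN-based. As a rough illustration only, here is a minimal PyTorch sketch of generic DANN-style adversarial feature alignment (gradient reversal); it is not the paper's architecture, and the module names (`features`, `classifier`, `discriminator`) and the weight `lam` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients in the
    backward pass, so the feature extractor is pushed toward features the
    domain discriminator cannot tell apart."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None

def train_step(features, classifier, discriminator, xs, ys, xt, opt, lam=1.0):
    """One adversarial step: supervised loss on labeled source pixels (xs, ys)
    plus a domain-confusion loss over source and unlabeled target pixels (xt)."""
    fs, ft = features(xs), features(xt)
    cls_loss = nn.functional.cross_entropy(classifier(fs), ys)
    # Domain labels: 0 = source, 1 = target; gradients to `features` are reversed.
    dom = torch.cat([GradReverse.apply(fs, lam), GradReverse.apply(ft, lam)])
    dom_labels = torch.cat([torch.zeros(len(fs)), torch.ones(len(ft))]).long()
    dom_loss = nn.functional.cross_entropy(discriminator(dom), dom_labels)
    loss = cls_loss + dom_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```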

Cited by 11 publications (1 citation statement) · References 49 publications
“…The differences in the classes could be shared between the two domains while detecting unknown classes in the target domain. Zhao et al. [222] proposed incorporating LiDAR data using GAN networks to decrease the discrepancy between the source and target domains by reducing the per-pixel spectral shift. They also utilized a modified fractional differential mask (FrDM) method to extract spatial-spectral information.…”
Section: Adversarial-based
Confidence: 99%
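The statement only names the FrDM technique. As a sketch under stated assumptions, the following NumPy code shows a standard (unmodified) Grünwald-Letnikov fractional differential mask applied along both spatial axes of an HSI cube; the paper's modified FrDM is not reproduced here, and the order `v`, the truncation length `n_terms`, and the wrap-around boundary handling are illustrative assumptions.

```python
import numpy as np

def gl_coefficients(v, n_terms=3):
    """Grunwald-Letnikov weights c_k = (-1)^k * binom(v, k),
    computed via the recurrence c_k = c_{k-1} * (k - 1 - v) / k."""
    c = np.empty(n_terms)
    c[0] = 1.0
    for k in range(1, n_terms):
        c[k] = c[k - 1] * (k - 1 - v) / k
    return c

def frdm_axis(cube, c, axis):
    """y[i] = sum_k c[k] * x[i - k] along one spatial axis
    (np.roll wraps at the border -- acceptable for a sketch)."""
    out = np.zeros_like(cube, dtype=float)
    for k, ck in enumerate(c):
        out += ck * np.roll(cube, k, axis=axis)
    return out

def fractional_mask(hsi, v=0.5, n_terms=3):
    """Fractional-differential response of an (H, W, B) HSI cube:
    apply the 1-D mask along rows and columns of every band and
    combine the two directional responses by magnitude."""
    c = gl_coefficients(v, n_terms)
    gy = frdm_axis(hsi, c, axis=0)  # vertical response
    gx = frdm_axis(hsi, c, axis=1)  # horizontal response
    return np.hypot(gx, gy)
```

For 0 < v < 1 the mask interpolates between the identity (v → 0) and a first-order difference (v → 1), which is why fractional masks retain more low-frequency texture than integer-order edge detectors while still enhancing spatial detail.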