2021
DOI: 10.3390/rs13214394

Deep/Transfer Learning with Feature Space Ensemble Networks (FeatSpaceEnsNets) and Average Ensemble Networks (AvgEnsNets) for Change Detection Using DInSAR Sentinel-1 and Optical Sentinel-2 Satellite Data Fusion

Abstract: Differential interferometric synthetic aperture radar (DInSAR) coherence, phase, and displacement are derived by processing SAR images to monitor geological phenomena and urban change. Previously, Sentinel-1 SAR data combined with Sentinel-2 optical imagery has improved classification accuracy in various domains. However, the fusion of DInSAR-processed Sentinel-1 imagery with Sentinel-2 optical imagery has not been thoroughly investigated. Thus, we explored this fusion for urban change detection by creating …

Cited by 7 publications (2 citation statements)
References 42 publications (84 reference statements)
“…While simple concatenation of Sentinel-2 and Sentinel-1 bands has been found lacking [2], more effective fusion techniques have been developed for combining Sentinel-2 optical imagery with Sentinel-1 SAR data, including a novel self-supervised framework for SAR-optical data fusion [30] and a multibranch convolutional neural network (MBCNN) that significantly enhances coastal land cover classification accuracy by effectively mining and fusing multitemporal and multisensor Sentinel data [31]. Notably, ensembles of pretrained SAR and RGB models have shown optimal performance [32], with methods such as the fusion of Sentinel-1 DInSAR imagery with Sentinel-2 for urban change detection demonstrating the superiority of transfer learning by feature extraction (TLFE) and the introduction of ensemble methods (FeatSpaceEnsNet, AvgEnsNet, HybridEnsNet) over traditional models. Furthermore, the fusion of multispectral satellite data with panchromatic images using a multi-branch approach has been explored [33].…”
Section: Related Work
confidence: 99%
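The average-ensemble combination named in the statement above (AvgEnsNet) can be illustrated with a minimal sketch. This is a hypothetical toy example, not the cited authors' implementation: the class-probability outputs of several independently trained models are averaged, and the class with the highest mean probability is predicted.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def average_ensemble(per_model_logits):
    """Average the class probabilities of several models
    (an AvgEnsNet-style combination) and return predicted labels."""
    probs = np.stack([softmax(l) for l in per_model_logits])  # (models, samples, classes)
    mean_probs = probs.mean(axis=0)
    return mean_probs.argmax(axis=-1)

# Two toy "models" over 2 samples x 2 classes; they disagree on sample 1,
# and averaging the probabilities resolves the disagreement.
m1 = np.array([[2.0, 0.1], [0.2, 1.5]])
m2 = np.array([[1.8, 0.3], [2.0, 0.1]])
print(average_ensemble([m1, m2]))  # → [0 0]
```

Averaging probabilities rather than raw logits keeps each model's contribution bounded, so one overconfident model cannot dominate the vote.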
“…One reason for the success of deep learning models is their ability to transfer previous learning to new tasks. In image classification, this transfer leads to more robust models and faster training [8-13]. Despite the importance of transfer in deep learning, there has been little insight into the nature of transferring relational knowledge, that is, the representations learnt by graph neural networks.…”
Section: Introduction and Related Work
confidence: 99%
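Transfer learning by feature extraction (TLFE), as referenced in the statements above, can be sketched schematically. This toy example uses a random frozen projection as a stand-in for a pretrained backbone (it is not any cited model): the backbone's weights are never updated, and only a small task-specific head is trained on the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone; under TLFE its weights stay FROZEN.
W_frozen = rng.normal(size=(8, 4))

def extract_features(x):
    # Pass inputs through the frozen backbone; its activations become
    # fixed feature vectors for the downstream task.
    return np.maximum(x @ W_frozen, 0.0)  # ReLU activations

# Toy downstream task: labels correlated with the backbone's features.
X = rng.normal(size=(64, 8))
y = (X @ W_frozen[:, 0] > 0).astype(float)
F = extract_features(X)

# Train only the small linear head by plain gradient descent on squared error.
w = np.zeros(4)
initial_mse = np.mean((F @ w - y) ** 2)
for _ in range(500):
    grad = F.T @ (F @ w - y) / len(y)
    w -= 0.01 * grad
final_mse = np.mean((F @ w - y) ** 2)
```

Because only the low-dimensional head is optimized, training is fast and needs far less labeled data than fitting the full network, which is the practical appeal of TLFE.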