Abstract. Change Detection (CD) is one of the most crucial applications in remote sensing, identifying meaningful changes between bitemporal images taken of the same location. Improving both the training efficiency and the accuracy of this task is of great importance, and one way to achieve this is transfer learning. In this study, we investigate the influence of transferring pre-trained weights on the performance of a Siamese CD network using a benchmark dataset. For this purpose, an autoencoder with the same encoder architecture as the Siamese model is trained on the whole dataset. The encoder weights are then transferred from the autoencoder, and the Siamese model is trained in two modes: in the first, the transferred weights are frozen and only the decoder of the Siamese model is trained, while in the second the whole model is trained without freezing any part. Moreover, the Siamese model is also trained without the pre-trained weights to establish a baseline for comparison. The results indicate that freezing the encoder yields relatively lower performance but offers considerable time savings during training. Training the whole model after the weight transfer, on the other hand, achieves the best result, with an improvement of 12.43% in the Intersection over Union (IoU) metric.
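The two training modes described in the abstract can be sketched as follows. This is a minimal, framework-agnostic illustration, not the authors' implementation: all class and attribute names (`Encoder`, `Autoencoder`, `SiameseCD`, `trainable`) are hypothetical, standing in for the equivalent operations in a real deep-learning framework (copying an encoder's parameters and toggling their trainability).

```python
# Hypothetical sketch of the weight-transfer setup described in the abstract.
# Class names and attributes are illustrative, not the authors' code.

class Encoder:
    """Stands in for the shared encoder architecture."""
    def __init__(self):
        self.weights = {"conv1": [0.0], "conv2": [0.0]}  # placeholder parameters
        self.trainable = True

class Autoencoder:
    """Pre-trained on the whole dataset; only its encoder is reused."""
    def __init__(self):
        self.encoder = Encoder()
        # Pretend these values were learned during autoencoder pre-training.
        self.encoder.weights = {"conv1": [0.42], "conv2": [0.17]}

class SiameseCD:
    """Siamese change-detection network with the same encoder architecture."""
    def __init__(self):
        self.encoder = Encoder()

    def load_pretrained_encoder(self, source: Encoder, freeze: bool) -> None:
        # Mode 1 (freeze=True): transferred weights stay fixed and only the
        # decoder would be trained. Mode 2 (freeze=False): fine-tune everything.
        self.encoder.weights = dict(source.weights)
        self.encoder.trainable = not freeze

ae = Autoencoder()

frozen_model = SiameseCD()
frozen_model.load_pretrained_encoder(ae.encoder, freeze=True)   # mode 1

finetune_model = SiameseCD()
finetune_model.load_pretrained_encoder(ae.encoder, freeze=False)  # mode 2
```

In a real framework the same idea would typically be expressed by copying the encoder's parameter state into the Siamese model and disabling gradient updates for those parameters in the frozen mode.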