Optical remote sensing imagery is at the core of many Earth observation activities. The regular, consistent, and global-scale nature of satellite data is exploited in many applications, such as cropland monitoring, climate change assessment, land-cover and land-use classification, and disaster assessment. However, one major problem severely affects the temporal and spatial availability of surface observations: cloud cover. The task of removing clouds from optical images has been the subject of study for decades. The advent of the Big Data era in satellite remote sensing opens new possibilities for tackling the problem with powerful data-driven deep learning methods. In this paper, a deep residual neural network architecture is designed to remove clouds from multispectral Sentinel-2 imagery. SAR-optical data fusion is used to exploit the synergistic properties of the two imaging systems to guide the image reconstruction. Additionally, a novel cloud-adaptive loss is proposed to maximize the retention of original information. The network is trained and tested on a globally sampled dataset comprising real cloudy and cloud-free images. The proposed setup allows the removal of even optically thick clouds by reconstructing an optical representation of the underlying land surface structure.
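The cloud-adaptive loss is only described at a high level in the abstract. A minimal sketch of one plausible formulation is given below, assuming a per-pixel binary cloud mask: cloud-covered pixels are penalized against the cloud-free target, while clear pixels are penalized against the original cloudy input, encouraging retention of information already present in the scene. The function name, arguments, and weighting are illustrative assumptions, not the paper's actual definition.

```python
import numpy as np

def cloud_adaptive_loss(pred, target, cloudy_input, cloud_mask, lam=1.0):
    """Hypothetical sketch of a cloud-adaptive L1 loss.

    pred         -- network output image
    target       -- cloud-free reference image
    cloudy_input -- original cloudy input image
    cloud_mask   -- binary mask, 1 where pixels are cloud-covered
    lam          -- assumed weight balancing the two terms
    """
    mask = cloud_mask.astype(float)
    # Cloudy pixels: reconstruct the cloud-free reference.
    cloudy_term = np.abs(pred - target) * mask
    # Clear pixels: stay close to the (already valid) input observation.
    clear_term = np.abs(pred - cloudy_input) * (1.0 - mask)
    return (cloudy_term + lam * clear_term).mean()
```

With a perfect reconstruction (output matching the reference under clouds and the input elsewhere), this loss is zero, which is the intuition behind "maximizing retention of original information".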
The majority of optical observations acquired via spaceborne Earth imagery are affected by clouds. While there is a large body of prior work on reconstructing cloud-covered information, previous studies are oftentimes confined to narrowly defined regions of interest, raising the question of whether an approach can generalize to a diverse set of observations acquired at variable cloud coverage or in different regions and seasons. We target the challenge of generalization by curating a large novel data set for training new cloud removal approaches, and we evaluate two recently proposed performance metrics of image quality and diversity. Our data set is the first publicly available one to contain a global sample of coregistered radar and optical observations, both cloudy and cloud-free. Based on the observation that cloud coverage varies widely between clear skies and complete coverage, we propose a novel model that can deal with either extreme and evaluate its performance on our proposed data set. Finally, we demonstrate the superiority of training models on real over synthetic data, underlining the need for a carefully curated data set of real observations. To facilitate future research, our data set is made available online.
Meteorological satellites continuously provide a wealth of data while observing the weather from space. With the next generation of satellites and space-borne imagers, the spatial resolution, temporal resolution, and spatial/spectral coverage are increasing, further increasing the amount of data. While this is certainly useful for severe weather monitoring and forecasting, it also puts high demands on the tools used to read, process, visualise, and analyse the data. The heterogeneous nature of both the satellite data and possible reference data from, e.g., ground-based networks requires further flexibility and modularity for effective data handling. Similarly, it is usually not feasible for a single user or user group to be familiar with all the different satellite data formats, including how to open the files and how to extract and visualise the relevant data.

The Satellite Information Familiarization Tool (SIFT) is a free and open-source interactive tool, initially developed at SSEC, University of Wisconsin, for the visualisation and analysis of satellite imagery data. In preparation for Meteosat Third Generation (MTG) and the launch of the Flexible Combined Imager (FCI) and Lightning Imager (LI) in 2022, EUMETSAT has further developed SIFT, with the main objective of having easy-to-use and responsive software compatible with data from different instruments that can be used for different tasks, including data visualisation and analysis, Cal/Val activities, and training. SIFT also utilises several of the Pytroll open-source libraries (e.g. Satpy) to read and process the data, thus enabling the visualisation and analysis of a multitude of different data sources.

The new version of SIFT, including the latest developments led by EUMETSAT, is expected to be released to users in 2023. With this presentation we want to show potential users how this tool can be used to visualise satellite data for severe storm analysis, including a live demonstration.
No abstract