A deep neural network is well suited to pixel-wise classification of remote sensing images because it effectively extracts features from the raw data. However, remote sensing images with higher spatial resolution exhibit smaller inter-class differences and greater intra-class differences, which makes feature extraction more difficult. The attention mechanism, which simulates the way humans comprehend and perceive images, helps acquire key features quickly and accurately. In this study, we propose a novel neural network that incorporates two kinds of attention mechanisms, a control gate (soft) attention mechanism in its mask branch and a feedback attention mechanism in its trunk branch, chosen according to each branch's primary role. The network thus equips a deep neural network with attention for pixel-wise classification of very high-resolution remote sensing (VHRRS) images. The control gate attention mechanism in the mask branch builds pixel-wise masks for the feature maps, assigning different priorities to different locations on different channels; this recalibrates feature extraction, emphasizes effective features, and weakens the influence of less useful ones. The feedback attention mechanism in the trunk branch retrieves high-level semantic features, providing additional guidance for lower layers to re-weight their focus and update higher-level feature extraction in a target-oriented manner. The two attention mechanisms are fused to form a neural network module. By stacking modules with mask branches of different scales, the network exploits attention-aware features under different local spatial structures. The proposed method is tested on VHRRS images from the BJ-02, GF-02, Geoeye, and Quickbird satellites, and the influence of the network structure and the rationality of the network design are discussed. Compared with other state-of-the-art methods, the proposed method achieves competitive accuracy, demonstrating its effectiveness.
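To make the mask/trunk structure concrete, the following is a minimal PyTorch sketch of a soft-attention module in which a mask branch produces a pixel-wise, channel-wise gate that recalibrates the trunk branch's features. The layer sizes, the pooling-based mask branch, and the residual-style fusion rule are illustrative assumptions, not the authors' implementation, and the feedback attention path is omitted.

```python
# Sketch of a mask/trunk soft-attention module (assumed structure, not the paper's code).
import torch
import torch.nn as nn


class AttentionModule(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Trunk branch: ordinary convolutional feature extraction.
        self.trunk = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Mask branch: down/upsampling enlarges the receptive field, and a
        # sigmoid produces a soft gate in [0, 1] per location and channel.
        self.mask = nn.Sequential(
            nn.MaxPool2d(kernel_size=2),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = self.trunk(x)
        m = self.mask(x)
        # Residual-style fusion: the mask recalibrates trunk features rather
        # than suppressing them outright (assumes even input height/width).
        return (1 + m) * t
```

Stacking several such modules, each with a mask branch operating at a different scale, would yield attention-aware features under different local spatial structures, as the abstract describes.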
This study explores the performance of Sentinel-2A Multispectral Instrument (MSI) imagery for extracting urban impervious surfaces using a modified linear spectral mixture analysis (MLSMA) method. Sentinel-2A MSI provides 10 m red, green, blue, and near-infrared bands and 20 m shortwave infrared bands, which were used to extract impervious surfaces. We aimed to map urban impervious surfaces at a 10 m spatial resolution in the main urban area of Guangzhou, China. In MLSMA, a built-up image was first extracted from the normalized difference built-up index (NDBI) using Otsu's method; the high-albedo, low-albedo, vegetation, and soil fractions were then estimated using conventional linear spectral mixture analysis (LSMA). The LSMA results were post-processed, by integrating the built-up image and the normalized difference vegetation index (NDVI), to obtain high-precision impervious surface, vegetation, and soil fractions. The performance of MLSMA was evaluated against Landsat 8 Operational Land Imager (OLI) imagery. Experimental results revealed that MLSMA can extract a high-precision impervious surface fraction at 10 m from Sentinel-2A imagery. The 10 m Sentinel-2A impervious surface map recovers more detail than the 30 m Landsat 8 map: continuous roads and the boundaries of buildings in urban environments were clearly identified.
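As an illustration of the first MLSMA step, the sketch below computes NDBI from near-infrared and shortwave-infrared bands and thresholds it with Otsu's method to obtain a binary built-up image, plus the NDVI used in the later post-processing. The array names, the use of scikit-image's threshold_otsu, and the assumption that the SWIR band has already been resampled to 10 m are illustrative; the paper's exact preprocessing and the LSMA unmixing step are not reproduced here.

```python
# Sketch of NDBI + Otsu built-up extraction and NDVI (assumed helper code, not the paper's).
import numpy as np
from skimage.filters import threshold_otsu


def built_up_mask(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Boolean built-up mask from NIR (10 m) and SWIR (resampled to 10 m) reflectance arrays."""
    # NDBI = (SWIR - NIR) / (SWIR + NIR); built-up pixels tend toward higher NDBI.
    ndbi = (swir.astype(np.float64) - nir) / (swir + nir + 1e-10)
    # Otsu's method picks the threshold that best separates the NDBI histogram
    # into built-up and non-built-up classes.
    return ndbi > threshold_otsu(ndbi)


def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI, later integrated with the built-up image to refine the LSMA fractions."""
    return (nir.astype(np.float64) - red) / (nir + red + 1e-10)
```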