In recent years, deep convolutional neural networks (DCNNs) have been widely used for object detection in remote sensing images. However, the over-parameterization of DCNNs hinders their deployment on resource-constrained remote sensing devices. To address this problem, we propose a network pruning method, named absorption pruning, to compress remote sensing object detection networks. Unlike the classical iterative three-stage pruning pipeline used in existing methods, absorption pruning is designed as a four-stage pipeline that only needs to be executed once. Furthermore, absorption pruning does not identify unimportant filters, as existing pruning methods do, but instead selects filters that are easy to learn. In addition, we design a pruning ratio adjustment method based on the object characteristics of remote sensing images, which helps absorption pruning better compress deep neural networks for remote sensing image processing. Experimental results on two typical remote sensing data sets, SSDD and RSOD, demonstrate that absorption pruning can not only remove 60% of the filter parameters from CenterNet101 without harming detection performance but also eliminate the over-fitting problem of the pre-trained network.
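To make the structured-pruning mechanics concrete, below is a minimal sketch, assuming PyTorch, of how whole output filters are removed from a convolution layer. The function name `prune_conv_filters` and the demo index set are hypothetical; the paper's "easy to learn" filter-selection criterion and its pruning ratio adjustment are not reproduced here.

```python
# Minimal sketch of structured filter pruning: build a smaller Conv2d that
# keeps only a chosen subset of output filters. The selection criterion is
# left abstract (the paper selects "easy to learn" filters, not shown here).
import torch
import torch.nn as nn


def prune_conv_filters(conv: nn.Conv2d, keep_idx: torch.Tensor) -> nn.Conv2d:
    """Return a smaller Conv2d containing only the output filters in keep_idx."""
    pruned = nn.Conv2d(
        conv.in_channels,
        len(keep_idx),
        conv.kernel_size,
        stride=conv.stride,
        padding=conv.padding,
        bias=conv.bias is not None,
    )
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned


if __name__ == "__main__":
    conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    prune_ratio = 0.6                      # e.g. remove 60% of the filters
    n_keep = int(conv.out_channels * (1.0 - prune_ratio))
    keep_idx = torch.arange(n_keep)        # placeholder selection, not the paper's
    pruned = prune_conv_filters(conv, keep_idx)
    x = torch.randn(1, 64, 32, 32)
    print(pruned(x).shape)                 # torch.Size([1, 51, 32, 32])
```

In a full network, the layers that consume the pruned output (the following convolution's input channels, any BatchNorm over those channels) must be shrunk consistently; the sketch only shows the single-layer case.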
Multi-scale target detection in synthetic aperture radar (SAR) images is one of the key techniques of SAR image interpretation and is widely used in national defense and security. However, multi-scale targets come in several types; for example, targets with similar-scale, large-scale, and ultra-large-scale differences coexist in SAR images. In particular, existing target detection methods struggle to detect both ultra-large-scale and ultra-small-scale targets in SAR images, resulting in poor detection results for these two types of targets. To solve these problems, this paper proposes an ultra-high-precision deep learning network (UltraHi-PrNet) to detect dense multi-scale targets. Firstly, a novel scale transfer layer is constructed to transfer the features of targets of different scales from bottom networks to top networks, ensuring that the features of ultra-small-scale, small-scale, and medium-scale targets in SAR images can be extracted more easily. Then, a novel scale expansion layer is constructed to enlarge the receptive field of feature extraction without changing the feature resolution, ensuring that the features of large-scale and ultra-large-scale targets in SAR images can be extracted more easily. Next, scale expansion layers with different expansion rates are densely connected to different stages of the backbone network, and the features of targets with ultra-large-scale differences are extracted. Finally, target classification and regression are performed based on Faster R-CNN. Experimental results on the SAR ship detection dataset (SSDD), AIR-SARShip-1.0, the high-resolution SAR ship detection dataset-2.0 (high-resolution SSDD-2.0), the SAR-ship-dataset, and the Gaofen-3 airport dataset show that the proposed method can detect similar-scale, large-scale, and ultra-large-scale targets more effectively. Compared with other advanced SAR target detection methods, it also achieves higher accuracy.
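The scale expansion layer is described as widening the receptive field without changing the feature resolution, which is the behavior of dilated (atrous) convolution. Below is a minimal sketch of that behavior, assuming PyTorch; the class name `ScaleExpansionBlock` and its exact composition are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of a dilation-based block in the spirit of a "scale
# expansion layer": the receptive field grows with the dilation rate while
# the spatial size of the feature map is preserved.
import torch
import torch.nn as nn


class ScaleExpansionBlock(nn.Module):
    """3x3 convolution with configurable dilation; spatial size is preserved."""

    def __init__(self, channels: int, dilation: int):
        super().__init__()
        # padding == dilation keeps the output the same height/width as the input
        self.conv = nn.Conv2d(channels, channels, kernel_size=3,
                              padding=dilation, dilation=dilation, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))


if __name__ == "__main__":
    x = torch.randn(1, 256, 64, 64)
    # Stacking blocks with growing dilation rates widens the receptive field
    # while the 64x64 resolution stays unchanged.
    for rate in (1, 2, 4, 8):
        x = ScaleExpansionBlock(256, dilation=rate)(x)
    print(x.shape)  # torch.Size([1, 256, 64, 64])
```

Attaching such blocks with different rates to different backbone stages, as the abstract describes, lets each stage cover a different span of target scales without additional downsampling.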