This study addresses the excessive parameter count and computational complexity of the Detection Transformer (DETR) and similar neural networks in remote sensing object detection. We propose a novel neural network pruning method, “ant colony evolutionary pruning (ACEP)”, which reduces the number of parameters to improve the efficiency of DETR-based networks in the remote sensing field. To retain as much of the original network’s performance as possible, we combine population evolution with ant colony optimization in a dynamic search process that automatically finds efficient sparse sub-networks. We further design three sparse operators tailored to the structural characteristics of DETR-like networks and, considering the characteristics of remote sensing objects, impose layer-wise sparsity constraints to achieve efficient pruning. Experimental results demonstrate that ACEP is effective across various DETR-like models: after removing a large number of redundant parameters, it substantially improves inference speed on remote sensing object detection tasks.
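
The hybrid search described above can be illustrated with a minimal sketch. This is not the authors’ implementation; all names, the fitness proxy, and the pheromone-update rule are illustrative assumptions, shown only to convey how an ant-colony sampler combined with evolutionary selection could search for a sparse keep/prune mask under a fixed sparsity constraint.

```python
import random

def acep_search(num_params=64, sparsity=0.5, ants=8, generations=20,
                evaporation=0.9, seed=0):
    """Hypothetical sketch of an ACEP-style mask search (not the paper's code).

    Each 'ant' samples a binary keep/prune mask biased by per-parameter
    pheromone; the best masks of each generation deposit pheromone, while
    an evaporation step forgets stale trails (the selection pressure of the
    evolutionary component).
    """
    rng = random.Random(seed)
    # Stand-in importance scores; a real system would use weight magnitudes
    # or a validation-loss proxy evaluated on the pruned sub-network.
    importance = [rng.random() for _ in range(num_params)]
    keep = int(num_params * (1 - sparsity))  # per-layer sparsity constraint
    pheromone = [1.0] * num_params
    best_mask, best_fit = None, float("-inf")
    for _ in range(generations):
        colony = []
        for _ in range(ants):
            # Sample `keep` parameters to retain, biased toward high pheromone.
            idx = set(sorted(range(num_params),
                             key=lambda i: -pheromone[i] * rng.random())[:keep])
            mask = [1 if i in idx else 0 for i in range(num_params)]
            fitness = sum(m * s for m, s in zip(mask, importance))
            colony.append((fitness, mask))
        colony.sort(key=lambda t: -t[0])
        # Evaporate, then let the top half of the colony deposit pheromone.
        pheromone = [p * evaporation for p in pheromone]
        for fit, mask in colony[: ants // 2]:
            for i, m in enumerate(mask):
                pheromone[i] += m * fit / num_params
        if colony[0][0] > best_fit:
            best_fit, best_mask = colony[0]
    return best_mask, best_fit
```

The returned mask always retains exactly `keep` parameters, mirroring the layer-wise sparsity constraint mentioned in the abstract; in a full system one such search would run per prunable layer or sparse operator.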