Optical remote sensing image detection has wide-ranging applications in both military and civilian domains. To address the false positives and missed detections caused by large variations in object scale in optical remote sensing imagery, a lightweight detection method based on an improved YOLOv5n is proposed; it enables rapid analysis, real-time detection, and target localization of remote sensing images even on platforms with limited computational resources. First, an adaptive spatial feature fusion mechanism is incorporated into the YOLOv5n feature fusion network to strengthen the fusion of features from objects at different scales. Second, the SIoU loss is introduced in place of the original YOLOv5n localization loss, redefining the penalty metric in terms of the vector angle between the predicted and ground-truth boxes; this accelerates the convergence of model training and improves detection performance. To validate the effectiveness of the proposed method, comparative experiments were conducted on optical remote sensing image datasets. The results show that the mean average precision of the improved network increases from 81.6% to 84.9%, while its average detection speed and network complexity remain clearly superior to those of the three existing object detection algorithms used for comparison.
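
For concreteness, a minimal PyTorch sketch of an adaptively weighted spatial feature fusion module is given below. It follows the general ASFF idea of learning per-pixel fusion weights across pyramid levels (Liu et al., 2019); the module name `ASFFHead`, the nearest-neighbour resizing, and the assumption that the inputs already share a channel count are illustrative choices, not details taken from the improved network described above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ASFFHead(nn.Module):
    """Illustrative adaptive spatial feature fusion: each pyramid level is
    resized to a target resolution and fused with learned per-pixel weights."""

    def __init__(self, channels: int, num_levels: int = 3):
        super().__init__()
        # One 1x1 conv per level produces a single-channel weight map.
        self.weight_convs = nn.ModuleList(
            nn.Conv2d(channels, 1, kernel_size=1) for _ in range(num_levels)
        )
        # 3x3 conv to refine the fused map (a common post-fusion step).
        self.refine = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, feats):
        # feats: list of tensors (B, C, Hi, Wi); fuse at the resolution of feats[0].
        target_size = feats[0].shape[-2:]
        resized = [
            f if f.shape[-2:] == target_size
            else F.interpolate(f, size=target_size, mode="nearest")
            for f in feats
        ]
        # Per-pixel weights, normalized across levels with a softmax.
        logits = torch.cat([conv(f) for conv, f in zip(self.weight_convs, resized)], dim=1)
        weights = torch.softmax(logits, dim=1)  # (B, num_levels, H, W)
        fused = sum(weights[:, i : i + 1] * resized[i] for i in range(len(resized)))
        return self.refine(fused)
```

Similarly, the sketch below transcribes the published SIoU formulation (Gevorgyan, 2022), in which angle, distance, and shape costs are added to the IoU term. It is not the authors' implementation; the (x1, y1, x2, y2) box format and the shape-cost exponent theta = 4 are assumed defaults.

```python
import torch


def siou_loss(pred, target, eps: float = 1e-7, theta: float = 4.0):
    """SIoU loss for boxes in (x1, y1, x2, y2) format, shape (N, 4):
    1 - IoU + (distance_cost + shape_cost) / 2, with the distance cost
    modulated by the angle between the box centers."""
    # Widths, heights, and centers of predicted and target boxes.
    pw, ph = pred[:, 2] - pred[:, 0], pred[:, 3] - pred[:, 1]
    tw, th = target[:, 2] - target[:, 0], target[:, 3] - target[:, 1]
    pcx, pcy = (pred[:, 0] + pred[:, 2]) / 2, (pred[:, 1] + pred[:, 3]) / 2
    tcx, tcy = (target[:, 0] + target[:, 2]) / 2, (target[:, 1] + target[:, 3]) / 2

    # IoU term.
    inter_w = (torch.min(pred[:, 2], target[:, 2]) - torch.max(pred[:, 0], target[:, 0])).clamp(0)
    inter_h = (torch.min(pred[:, 3], target[:, 3]) - torch.max(pred[:, 1], target[:, 1])).clamp(0)
    inter = inter_w * inter_h
    union = pw * ph + tw * th - inter + eps
    iou = inter / union

    # Smallest enclosing box, used to normalize the center offset.
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0]) + eps
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1]) + eps

    # Angle cost: peaks at a 45-degree center offset, vanishes at 0 or 90 degrees.
    dx, dy = tcx - pcx, tcy - pcy
    sigma = torch.sqrt(dx ** 2 + dy ** 2) + eps
    sin_alpha = (torch.abs(dy) / sigma).clamp(max=1.0)
    angle_cost = torch.sin(2 * torch.arcsin(sin_alpha))

    # Distance cost, modulated by the angle cost through gamma = 2 - angle_cost.
    gamma = 2 - angle_cost
    rho_x = (dx / cw) ** 2
    rho_y = (dy / ch) ** 2
    distance_cost = (1 - torch.exp(-gamma * rho_x)) + (1 - torch.exp(-gamma * rho_y))

    # Shape cost: penalizes mismatch in box width and height.
    omega_w = torch.abs(pw - tw) / torch.max(pw, tw).clamp(min=eps)
    omega_h = torch.abs(ph - th) / torch.max(ph, th).clamp(min=eps)
    shape_cost = (1 - torch.exp(-omega_w)) ** theta + (1 - torch.exp(-omega_h)) ** theta

    return 1 - iou + (distance_cost + shape_cost) / 2
```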