To satisfy the obstacle avoidance requirements of unmanned agricultural machinery during autonomous operation and to address the challenge of rapid obstacle detection in complex field environments, an improved field obstacle detection model based on YOLOv8 was proposed. The model enabled fast detection and recognition of common field obstacles such as people, tractors, and electric power pylons. It was built on the YOLOv8 architecture with three main improvements. First, the CBAM (Convolutional Block Attention Module) was integrated into the backbone of the baseline model to adapt to varied field tasks and complex environments, improve the detector's sensitivity to targets of different sizes and positions, and enhance detection accuracy. Second, a BiFPN (Bi-directional Feature Pyramid Network) replaced the original PANet to strengthen multi-scale feature fusion, improving the model's ability to distinguish obstacles from the background. Third, the bounding-box regression loss was replaced with WIoU v3 (Wise Intersection over Union v3), which assigns greater focus to medium-quality anchor boxes and improves overall detector performance. A dataset of 5963 images of people, electric power pylons, telegraph poles, tractors, and harvesters in farmland environments was constructed, with 4771 images used for training and 596 each for validation and testing. Experimental results showed that the improved model attained precision, recall, and average precision of 85.5%, 75.1%, and 82.5%, respectively, on the custom dataset, increases of 1.3, 1.2, and 1.9 percentage points over the baseline YOLOv8 model. In addition, the model ran at 52 detection frames per second, significantly improving detection performance for common obstacles in the field.
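The BiFPN mentioned above combines feature maps from different scales with learnable, non-negative weights ("fast normalized fusion"). The following is a minimal sketch of that fusion step, assuming flattened feature maps represented as plain Python lists for illustration; the function name and data layout are ours, not from the original model.

```python
def fast_normalized_fusion(features, weights, eps=1e-4):
    """BiFPN-style fast normalized fusion (illustrative sketch).

    features: list of equal-length lists, each standing in for a
              (resized) feature map from one scale.
    weights:  one learnable scalar per input; ReLU keeps them >= 0,
              and normalization makes the fused output a weighted mean.
    """
    w = [max(0.0, wi) for wi in weights]  # ReLU on the fusion weights
    norm = sum(w) + eps                   # eps avoids division by zero
    return [
        sum(wi * f[k] for wi, f in zip(w, features)) / norm
        for k in range(len(features[0]))
    ]
```

In the real network each weight is a trainable parameter and the inputs are multi-channel tensors, but the normalization logic is the same.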
The model enhanced with these techniques maintains a high level of detection accuracy while meeting the real-time obstacle identification requirements of unmanned agricultural machinery during field operation.
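The WIoU v3 loss used for bounding-box regression can be sketched as follows. This is an illustrative, single-pair implementation under our own assumptions: the running mean of the IoU loss is assumed to be tracked elsewhere during training, and the hyperparameter values (alpha=1.9, delta=3.0) are commonly reported defaults, not values taken from this paper.

```python
import math

def iou(b1, b2):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter + 1e-9)

def wiou_v3_loss(pred, gt, mean_iou_loss, alpha=1.9, delta=3.0):
    """Sketch of WIoU v3 for one predicted/ground-truth box pair.

    mean_iou_loss: running mean of L_IoU over training (assumed to be
    maintained outside this function). In a framework, the distance term
    and beta would be detached from the gradient; omitted here.
    """
    l_iou = 1.0 - iou(pred, gt)
    # Distance penalty over the smallest enclosing box (WIoU v1 term)
    cxp, cyp = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    cxg, cyg = (gt[0] + gt[2]) / 2, (gt[1] + gt[3]) / 2
    wg = max(pred[2], gt[2]) - min(pred[0], gt[0])
    hg = max(pred[3], gt[3]) - min(pred[1], gt[1])
    r_wiou = math.exp(((cxp - cxg) ** 2 + (cyp - cyg) ** 2)
                      / (wg ** 2 + hg ** 2 + 1e-9))
    l_v1 = r_wiou * l_iou
    # Non-monotonic focusing: the gain r peaks for medium-quality boxes
    # (outlier degree beta near delta), down-weighting very good and
    # very poor anchors.
    beta = l_iou / (mean_iou_loss + 1e-9)
    r = beta / (delta * alpha ** (beta - delta))
    return r * l_v1
```

The non-monotonic gain `r` is what gives medium-quality anchor boxes the greater focus described above.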