Existing object detection methods with large numbers of parameters and heavy computational costs are ill-suited for deployment on resource-constrained devices in agricultural environments. This study therefore proposes a lightweight crop pest detection method based on convolutional neural networks, named YOLOLite-CSG. Its basic architecture is derived from YOLOLite, a simplified version of YOLOv3, and k-means++ is used to improve the generation of the prior boxes. In addition, a lightweight sandglass block and coordinate attention are used to optimize the structure of the residual blocks. The method was evaluated on the CP15 crop pest dataset. Its detection precision, 82.9%, exceeds that of YOLOv3, while it uses only 5 million parameters (8.1% of YOLOv3's) and 9.8 GFLOPs (15% of YOLOv3's). Furthermore, its detection precision is superior to that of every other commonly used object detection method evaluated in this study, by up to 10.6%, while it retains a significant edge in parameter count and computational cost. With excellent pest detection precision at extremely low parameter and computation budgets, the method is well suited for deployment on crop pest detection equipment in agricultural environments.
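The abstract does not specify how k-means++ is applied, so the following is a minimal illustrative sketch: clustering ground-truth box sizes with k-means++ seeding to produce prior (anchor) boxes. It assumes Euclidean distance on normalized (w, h) pairs via scikit-learn, whereas YOLO-family anchor clustering often uses a 1 − IoU distance; the box data here is a synthetic placeholder for the CP15 labels.

```python
# Hypothetical sketch of prior-box generation via k-means++ seeding.
# Assumes Euclidean distance on normalized (w, h); the paper may use
# the common 1 - IoU distance instead.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Placeholder box sizes; real code would parse the CP15 annotations.
box_wh = rng.uniform(0.02, 0.6, size=(5000, 2))

kmeans = KMeans(n_clusters=9, init="k-means++", n_init=10, random_state=0)
kmeans.fit(box_wh)

# Sort anchors by area: YOLO assigns small anchors to high-resolution scales.
anchors = kmeans.cluster_centers_
anchors = anchors[np.argsort(anchors.prod(axis=1))]
print(np.round(anchors, 3))
```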
Crop diseases are among the major factors affecting crop yield and quality and are an important research target in agriculture. To identify crop diseases quickly and accurately, help farmers control them in time, and reduce crop losses, and inspired by the application of convolutional neural networks to image identification, we propose a lightweight crop disease image identification model based on attentional feature fusion, named DSGIResNet_AFF, which introduces self-built lightweight residual blocks, inverted residual blocks, and attentional feature fusion modules on the basis of ResNet18. We apply the model to the identification of rice and corn diseases, and the results demonstrate its effectiveness on a real dataset. The model is also compared with other convolutional neural networks (AlexNet, VGG16, ShuffleNetV2, MobileNetV2, MobileNetV3-Small, and MobileNetV3-Large); the accuracy, sensitivity, F1-score, and AUC of the proposed DSGIResNet_AFF are 98.30%, 98.23%, 98.24%, and 99.97%, respectively, all better than those of the other network models, while model complexity is significantly reduced: compared with the base model ResNet18, the number of parameters is reduced by 94.10% and the floating-point operations (FLOPs) by 86.13%. DSGIResNet_AFF can be deployed on mobile devices as a useful tool for identifying crop diseases.
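As an illustration of one component the abstract names, here is a minimal PyTorch sketch of a MobileNetV2-style inverted residual block, one of the blocks DSGIResNet_AFF adds on top of ResNet18. The channel sizes and expansion factor are assumptions for demonstration, not the paper's exact configuration, and the self-built residual and attentional feature fusion modules are not reproduced here.

```python
# Illustrative inverted residual block (MobileNetV2 style); sizes are
# assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    def __init__(self, in_ch, out_ch, stride=1, expand=6):
        super().__init__()
        hidden = in_ch * expand
        self.use_skip = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 1, bias=False),        # 1x1 expand
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, 3, stride, 1,
                      groups=hidden, bias=False),            # 3x3 depthwise
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, out_ch, 1, bias=False),        # 1x1 project
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        y = self.block(x)
        return x + y if self.use_skip else y

x = torch.randn(1, 64, 56, 56)
print(InvertedResidual(64, 64)(x).shape)  # torch.Size([1, 64, 56, 56])
```

The depthwise convolution (groups equal to channel count) is what makes this block cheap relative to a standard ResNet18 residual block, which is consistent with the large parameter and FLOPs reductions the abstract reports.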