Fruit- and vegetable-picking robots operate in complex orchard environments that degrade the vision system's recognition and segmentation of target fruits. The orchard environment is complex and changeable: varying light intensity blurs the surface characteristics of the target fruit, and fruits often overlap one another or are occluded by branches and leaves, leaving incomplete fruit shapes that are difficult to identify and segment individually. To address these difficulties, a two-stage instance segmentation method based on an optimized mask region-based convolutional neural network (Mask R-CNN) is proposed. The new model adopts the lightweight MobileNetV3 backbone, which not only speeds up the model but also greatly improves its accuracy while meeting the storage constraints of a mobile robot. To further improve segmentation quality, a boundary patch refinement (BPR) post-processing module is added to refine the coarse mask boundaries of the model output and reduce mislabeled pixels. The resulting high-precision recognition and efficient segmentation strategy improve the robustness and stability of the model. The new model is validated on a persimmon dataset: the optimized Mask R-CNN achieves a mean average precision (mAP) of 76.3% and a mean average recall (mAR) of 81.1%, improvements of 3.1 and 3.7 percentage points, respectively, over the baseline Mask R-CNN. The experiments show that the new model delivers higher accuracy and segmentation quality and can be widely deployed in smart agriculture.
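The backbone swap the abstract describes can be prototyped in a few lines with torchvision. The sketch below is an illustration under stated assumptions, not the authors' implementation: it pairs a MobileNetV3-Large feature extractor with torchvision's MaskRCNN head, the anchor sizes, two-class persimmon setup, and input resolution are placeholders, and the BPR post-processing stage is omitted.

```python
import torch
import torchvision
from torchvision.models.detection import MaskRCNN
from torchvision.models.detection.anchor_utils import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# Backbone: MobileNetV3-Large feature extractor. Its trailing 1x1 conv
# outputs 960 channels; MaskRCNN reads the out_channels attribute.
backbone = torchvision.models.mobilenet_v3_large(weights="DEFAULT").features
backbone.out_channels = 960

# Single feature map, so one tuple of anchor sizes and aspect ratios.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)

# RoI pooling over the single map named "0": 7x7 for boxes, 14x14 for masks.
box_roi_pool = MultiScaleRoIAlign(featmap_names=["0"], output_size=7, sampling_ratio=2)
mask_roi_pool = MultiScaleRoIAlign(featmap_names=["0"], output_size=14, sampling_ratio=2)

model = MaskRCNN(
    backbone,
    num_classes=2,  # background + persimmon (assumed single fruit class)
    rpn_anchor_generator=anchor_generator,
    box_roi_pool=box_roi_pool,
    mask_roi_pool=mask_roi_pool,
)

model.eval()
with torch.no_grad():
    # Inference returns one dict per image with boxes, labels, scores, masks.
    predictions = model([torch.rand(3, 512, 512)])
print(predictions[0]["masks"].shape)  # [N, 1, 512, 512] instance masks
```

In this setup the coarse mask logits from the mask head are exactly what a BPR-style refinement stage would consume: boundary patches are cropped around each predicted mask edge and re-segmented at higher resolution before the final output.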
In complex orchard environments, efficient and accurate detection of target fruit is a basic requirement for orchard yield estimation and automatic harvesting. Green fruits are often hard to distinguish from the background because of their similar color, and detection is further complicated by ambient lighting and the camera angle at which photos are taken. In this study, a two-stage dense-to-detection framework (D2D) is proposed to detect green fruits in orchard environments. The model extracts multi-scale features of the target fruit with a MobileNetV2 + feature pyramid network (FPN) structure and generates region proposals of the target fruit with a region proposal network (RPN). In the regression branch, the offset of each local feature is computed, and positive and negative samples among the region proposals are predicted with a binary mask prediction to reduce background interference in the predicted boxes. In the classification branch, features are extracted from each sub-region of a region proposal, and adaptive weighted pooling retains the discriminative features needed for accurate classification. The model adopts an anchor-free design, which improves generalization, makes the model more robust, and reduces storage requirements. Experimental results on persimmon and green apple datasets show that the new model achieves the best detection performance and can provide a theoretical reference for other green-object detection tasks.
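A minimal sketch of the MobileNetV2 + FPN feature extraction stage described above, built from torchvision components. The tapped stage indices, channel counts, and the 512 x 512 input follow the standard torchvision MobileNetV2 layout and are assumptions for illustration; the D2D regression and classification heads are not reproduced here.

```python
import torch
from torchvision.models import mobilenet_v2
from torchvision.models.feature_extraction import create_feature_extractor
from torchvision.ops import FeaturePyramidNetwork

# Tap three MobileNetV2 stages at strides 8, 16, and 32. In the standard
# torchvision layout these emit 32, 96, and 1280 channels, respectively.
body = create_feature_extractor(
    mobilenet_v2(weights="DEFAULT"),
    return_nodes={"features.6": "p3", "features.13": "p4", "features.18": "p5"},
)

# FPN fuses the three maps top-down into uniform 256-channel features.
fpn = FeaturePyramidNetwork(in_channels_list=[32, 96, 1280], out_channels=256)

x = torch.rand(1, 3, 512, 512)
pyramid = fpn(body(x))  # dict of 256-channel maps at three scales
for name, feat in pyramid.items():
    print(name, tuple(feat.shape))
# p3 (1, 256, 64, 64), p4 (1, 256, 32, 32), p5 (1, 256, 16, 16)
```

Each pyramid level would then feed the regression and classification branches, with the per-location offsets replacing preset anchor boxes in the anchor-free design.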