Automating vineyard cultivation requires mobile robots to maintain an accurate localization system. This paper introduces a stereo vision-based Graph-Simultaneous Localization and Mapping (Graph-SLAM) pipeline custom-tailored to the specificities of vineyard fields. Graph-SLAM is reinforced with Loop Closure Detection (LCD) based on semantic segmentation of the vine trees. The Mask R-CNN network is applied to segment the trunk regions of images, on which unique visual features are extracted. These features populate the bag of visual words (BoVWs) maintained on the formulated graph. A nearest neighbor search is applied to each query trunk image to associate each unique feature descriptor with the corresponding node in the graph through a voting procedure. A probabilistic method then selects the most suitable loop-closing pair and, upon detection of a loop closure, the 3D points of the trunks are used to estimate the loop closure constraint added to the graph. Restricting features to trunk segments drastically reduces the number of retained visual words, which in turn significantly expedites loop closure and graph optimization, rendering the method suitable for large-scale mapping in vineyards. The pipeline has been evaluated on several data sequences gathered from real vineyards across different seasons, when the appearance of the vine trees varies significantly, and exhibited robust mapping over long distances.
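As a hedged illustration of the descriptor-voting step described in the abstract, the sketch below implements a minimal bag-of-visual-words lookup in which trunk descriptors are tagged with the graph node they were extracted from, and a nearest-neighbor query votes for a loop-closure candidate. The descriptor dimensionality, vote threshold, and all class and function names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a BoVW nearest-neighbour voting step for loop-closure
# candidate selection. Descriptor size, thresholds, and names are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors


class TrunkBoVWDatabase:
    """Stores trunk-region descriptors tagged with the graph node they came from."""

    def __init__(self):
        self.descriptors = []   # list of (D,) float arrays
        self.node_ids = []      # graph node each descriptor belongs to
        self._index = None

    def add_node(self, node_id, trunk_descriptors):
        """Add descriptors extracted from the trunk segments of one keyframe."""
        for d in trunk_descriptors:
            self.descriptors.append(d)
            self.node_ids.append(node_id)
        self._index = None      # mark the search index as stale

    def _build_index(self):
        self._index = NearestNeighbors(n_neighbors=1).fit(np.vstack(self.descriptors))

    def query(self, query_descriptors, min_votes=8):
        """Vote for previously visited nodes; return the best loop-closure candidate."""
        if self._index is None:
            self._build_index()
        _, nn_idx = self._index.kneighbors(np.vstack(query_descriptors))
        votes = {}
        for i in nn_idx.ravel():
            node = self.node_ids[i]
            votes[node] = votes.get(node, 0) + 1
        best_node, best_votes = max(votes.items(), key=lambda kv: kv[1])
        return (best_node, best_votes) if best_votes >= min_votes else (None, best_votes)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    db = TrunkBoVWDatabase()
    for node in range(5):
        db.add_node(node, rng.normal(size=(20, 32)))   # 20 synthetic 32-D descriptors per node
    # Simulate revisiting node 2 with slightly perturbed descriptors.
    revisit = np.vstack(db.descriptors[40:60]) + 0.01 * rng.normal(size=(20, 32))
    print(db.query(revisit))                           # expected to vote for node 2
```

In the full pipeline this candidate would then be validated probabilistically and, if accepted, the trunks' 3D points would be used to compute the loop-closure constraint; the sketch only covers the voting stage.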
Achieving robust long-term deployment of mobile robots in the agricultural domain is both an in-demand and challenging task. The possibility of having autonomous platforms in the field performing repetitive tasks, such as monitoring or harvesting crops, collides with the difficulties posed by the ever-changing appearance of the environment due to seasonality. With this aim in mind, we report an ongoing effort in the long-term deployment of an autonomous mobile robot in a vineyard, with the main objective of acquiring what we call the Bacchus Long-Term (BLT) data set. This data set consists of multiple sessions recorded in the same area of a vineyard at different points in time, covering a total of 7 months to capture the whole canopy growth from March until September. The multimodal data set is acquired with the main focus on advancing the development and evaluation of mapping and localization algorithms for long-term autonomous robot operation in the agricultural domain. Hence, besides the data set, we also present an initial study in long-term localization using four sessions belonging to four different months with different plant stages. We find that state-of-the-art localization methods can only partially cope with the amount of change in the environment, making the proposed data set suitable for establishing a benchmark on which the robotics community can test its methods. On our side, we outline two solutions aimed at extracting stable temporal features for improving long-term 4D localization results. The BLT data set is available at https://lncn.ac/lcas-blt.
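To make the cross-session evaluation mentioned in the abstract concrete, the following minimal sketch scores a localization estimate against a reference trajectory using a translational RMSE. The metric choice, session names, and every numeric value are illustrative assumptions generated synthetically; none of them come from the BLT data set or the reported study.

```python
# Hedged sketch: scoring per-session localization with a translational RMSE.
# Sessions and error magnitudes are synthetic placeholders, not BLT results.
import numpy as np


def translational_rmse(estimated_xyz, reference_xyz):
    """Root-mean-square translational error between time-aligned trajectories."""
    errors = np.linalg.norm(estimated_xyz - reference_xyz, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    reference = np.cumsum(rng.normal(size=(500, 3)), axis=0)   # synthetic reference trajectory
    # Hypothetical per-session noise levels, standing in for seasonal appearance change.
    sessions = {"march": 0.05, "may": 0.15, "july": 0.40, "september": 0.25}
    for name, drift in sessions.items():
        estimate = reference + rng.normal(scale=drift, size=reference.shape)
        print(f"{name:9s} translational RMSE: {translational_rmse(estimate, reference):.3f} m")
```

A real benchmark run would substitute the recorded BLT sessions and the ground-truth poses distributed with the data set for the synthetic arrays above.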