Realizing autonomous inspection, such as that of power distribution lines, through unmanned aerial vehicle (UAV) systems is a key research domain in robotics. In particular, the use of autonomous and semi-autonomous vehicles to execute the tasks of an inspection process can enhance the efficacy and safety of the operation; however, many technical problems, such as those pertaining to the precise positioning and path following of the vehicles, robust obstacle detection, and intelligent control, must be addressed. In this study, an innovative architecture involving a UAV and an unmanned ground vehicle (UGV) was examined for detailed inspections of power lines. In the proposed strategy, each vehicle provides its position information to the other, which ensures a safe inspection process. The results of real-world experiments indicate satisfactory performance, thereby demonstrating the feasibility of the proposed approach.
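As a rough illustration of the mutual position-sharing idea described in this abstract, the following Python sketch checks the separation between the two vehicles before the inspection proceeds. The data structure, the `safe_to_proceed` helper, and the 3 m safety margin are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' implementation): each vehicle
# periodically shares its position and the inspection only continues if
# the separation to its partner stays above an assumed safety margin.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def separation(a: Pose, b: Pose) -> float:
    """Euclidean distance between the two vehicles."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def safe_to_proceed(uav: Pose, ugv: Pose, min_separation_m: float = 3.0) -> bool:
    """Return True if the vehicles keep at least the assumed safety margin."""
    return separation(uav, ugv) >= min_separation_m

# Example exchange: the UAV holds position if it gets too close to the UGV.
uav_pose = Pose(10.0, 2.0, 8.0)
ugv_pose = Pose(9.0, 1.5, 0.0)
if not safe_to_proceed(uav_pose, ugv_pose):
    print("Hold position: separation below safety margin")
```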
Recent advances in precision agriculture are largely due to the emergence of modern robotic systems. Unmanned aerial systems (UASs), for instance, open new possibilities for addressing existing problems in this area, owing to these platforms' ability to perform activities at varying levels of complexity. This research therefore presents a multiple-cooperative-robot solution in which UAS and unmanned ground vehicle (UGV) systems jointly inspect insect traps in an olive grove. The work evaluated vision-based navigation for the UAS and UGV, using yellow fly traps fixed in the trees to provide visual position data via the You Only Look Once (YOLO) algorithms. The experimental setup also evaluated a fuzzy control algorithm applied to the UAS so that it reaches the trap efficiently. Experimental tests were conducted in a realistic simulation environment using the Robot Operating System (ROS) and the CoppeliaSim platform to verify the methodology's performance, and all tests considered specific real-world environmental conditions. A search-and-landing algorithm based on augmented reality tag (AR-Tag) visual processing was evaluated to allow the UAS to return to and land on the UGV base. The outcomes obtained in this work demonstrate the robustness and feasibility of the multiple-cooperative-robot architecture for UGVs and UASs applied to the olive grove inspection scenario.
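To illustrate how a detected trap can drive the vehicle toward it, the sketch below maps a normalised YOLO bounding-box offset to velocity commands through a coarse fuzzy-style blend of "slow" and "fast" rules. The membership function, speed limits, and variable names are assumptions for illustration, not the fuzzy controller evaluated in the paper.

```python
# Hypothetical sketch, not the paper's controller: map a YOLO bounding-box
# centre offset (normalised image coordinates) to UAS velocity commands with
# a coarse fuzzy-style rule base ("near"/"far" membership on the error).
def membership_far(error: float) -> float:
    """Degree to which the trap is 'far' from the image centre (0..1)."""
    return min(1.0, abs(error) / 0.5)

def fuzzy_velocity(error: float, max_speed: float = 0.6) -> float:
    """Blend 'slow' and 'fast' commands by the 'far' membership degree."""
    far = membership_far(error)
    slow, fast = 0.1 * max_speed, max_speed
    speed = (1.0 - far) * slow + far * fast
    return speed if error >= 0 else -speed

# Bounding-box centre (cx, cy) normalised to [0, 1]; image centre is (0.5, 0.5).
cx, cy = 0.72, 0.41
vy = fuzzy_velocity(cx - 0.5)   # lateral command toward the trap
vz = fuzzy_velocity(0.5 - cy)   # vertical command toward the trap
print(f"lateral {vy:+.2f} m/s, vertical {vz:+.2f} m/s")
```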
Unmanned aerial vehicles (UAVs) are a suitable solution for monitoring growing crops because they can cover large areas and such monitoring must be performed periodically. In inspection and monitoring tasks, the UAV must find an optimal or near-optimal collision-free route between given initial and target positions. Path-planning strategies are therefore crucial, especially online path planning that can represent the robot's operational environment and serve control purposes. This paper proposes an online adaptive path-planning solution based on the fusion of rapidly exploring random trees (RRT) and deep reinforcement learning (DRL) algorithms, applied to the generation and control of the UAV's autonomous trajectory during an olive grove fly-trap inspection task. The main objective of this proposal is to provide a reliable route for the UAV to reach the inspection points in the tree space and autonomously capture an image of the trap while avoiding possible obstacles present in the environment. The proposed framework was tested in a simulated environment using Gazebo and ROS. The results show that the proposed solution accomplished the trials in environments of up to 300 m³ containing 10 dynamic objects.
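For readers unfamiliar with RRT, the minimal Python sketch below shows the basic sampling-and-extension loop this kind of framework builds on. The step size, goal bias, spherical obstacles, and the suggestion that a learned policy could replace the `steer` step are assumptions for illustration, not the authors' RRT-DRL fusion.

```python
# Minimal RRT sketch (an illustration of the general idea, not the authors'
# RRT+DRL fusion): sample random points, extend the nearest tree node, and
# reject extensions that collide with spherical obstacles. A learned policy
# could replace `steer` to bias growth toward the inspection point.
import math, random

def dist(a, b):
    return math.dist(a, b)

def steer(from_pt, to_pt, step=0.5):
    """Move a fixed step from `from_pt` toward `to_pt`."""
    d = dist(from_pt, to_pt)
    if d <= step:
        return to_pt
    t = step / d
    return tuple(f + t * (g - f) for f, g in zip(from_pt, to_pt))

def collision_free(p, obstacles):
    """Obstacles are (centre, radius) spheres."""
    return all(dist(p, c) > r for c, r in obstacles)

def rrt(start, goal, obstacles, bounds=10.0, iters=2000, goal_tol=0.5):
    tree = {start: None}                      # node -> parent
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else tuple(
            random.uniform(0, bounds) for _ in range(3))
        nearest = min(tree, key=lambda n: dist(n, sample))
        new = steer(nearest, sample)
        if collision_free(new, obstacles):
            tree[new] = nearest
            if dist(new, goal) < goal_tol:
                path, node = [], new
                while node is not None:       # walk parents back to the start
                    path.append(node)
                    node = tree[node]
                return path[::-1]
    return None                               # no route found within budget

obstacles = [((5.0, 5.0, 5.0), 1.0)]
print(rrt((0.0, 0.0, 0.0), (9.0, 9.0, 9.0), obstacles))
```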