In recent years, high-speed navigation and environment interaction for aerial robots have become fields of interest for academic and industrial research. In particular, Search and Intercept (SaI) applications for aerial robots pose a compelling research area due to their potential usability in several environments. Nevertheless, SaI tasks involve challenging development regarding sensor weight, onboard computational resources, actuation design, and perception and control algorithms, among others. In this work, we propose a fully autonomous aerial robot for high-speed object grasping. As an additional subtask, our system is able to autonomously pierce balloons mounted on poles close to the ground. Our first contribution is the actuation and sensing design of the aerial robot, including a novel gripper with additional sensors that enables the robot to grasp objects at high speed. The second contribution is a complete software framework comprising perception, state estimation, motion planning, motion control, and mission control, which allows the autonomous grasping mission to be performed rapidly and robustly. Our approach has been validated in a challenging international competition, where it showed outstanding results, autonomously searching for, following, and grasping a moving object at 6 m/s in an outdoor environment.
This paper presents a novel approach for accurate counting and localization of tropical plants in aerial images that is able to work in new visual domains for which the available data is not labeled. Our approach uses deep learning and domain adaptation, and is designed to handle the domain shift between training and test data, a common challenge in agricultural applications. The method uses a source dataset with annotated plants and a target dataset without annotations, and adapts a model trained on the source dataset to the target dataset using unsupervised domain alignment and pseudolabeling. The experimental results show the effectiveness of this approach for plant counting in aerial images of pineapples under significant domain shift, achieving a reduction of up to 97% in the counting error compared to the supervised baseline.
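The pseudolabeling step described above can be illustrated generically: a model trained on labeled source data predicts on the unlabeled target data, its most confident predictions are kept as pseudolabels, and the model is retrained on the combined set. The sketch below is not the authors' implementation; it is a minimal, hypothetical illustration of the idea using a toy nearest-centroid classifier and a confidence measure based on the margin between the two closest centroids.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    # Fit a toy classifier: one centroid per class label.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_with_confidence(centroids, X):
    # Predict the nearest centroid; confidence is the distance
    # margin between the best and second-best class.
    labels = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in labels], axis=1)
    pred = np.array(labels)[d.argmin(axis=1)]
    s = np.sort(d, axis=1)
    conf = s[:, 1] - s[:, 0]
    return pred, conf

def pseudolabel_adapt(Xs, ys, Xt, threshold=0.5):
    # 1) Train on the labeled source domain.
    model = nearest_centroid_fit(Xs, ys)
    # 2) Predict on the unlabeled target domain and keep only
    #    predictions whose confidence exceeds the threshold.
    pred, conf = predict_with_confidence(model, Xt)
    keep = conf > threshold
    # 3) Retrain on source data plus confident target pseudolabels,
    #    pulling the decision boundary toward the target domain.
    X_aug = np.vstack([Xs, Xt[keep]])
    y_aug = np.concatenate([ys, pred[keep]])
    return nearest_centroid_fit(X_aug, y_aug)
```

In a real detection-based counting pipeline the classifier would be a deep detector and the confidence a detection score, but the retraining loop follows the same pattern: only high-confidence target predictions are trusted as labels, which limits the noise introduced by the domain shift.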
In recent years, the use of unmanned aerial vehicles (UAVs) has spread across many industrial fields due to their ease of deployment and minimal operational risk. Firefighting is a dangerous task for the humans involved, and UAVs offer a good first-response option for rapid action against an incipient fire thanks to their safety and speed of action. Current research focuses mainly on wildland fires, while fires in urban environments are barely addressed in the literature. To motivate research on this topic, ICUAS'22 organized an international competition inspired by this mission, in which a UAV must traverse an area populated by obstacles, find a target, and precisely throw a ball at it. For this competition, the Computer Vision and Aerial Robotics (CVAR-UPM) team developed a solution composed of multiple modules coordinated by a mission planner. In this paper, we describe our approach and the architecture that led us to be awarded the first prize in the competition.