Accurate system modeling and identification gain importance as the tasks executed by autonomously acting unmanned aerial vehicles (UAVs) become more complex and demanding. This paper presents a Bayesian filter approach that identifies the system parameters, sensor suite calibration states, and vehicle navigation states online and continuously in a holistic framework. Previous work only tackles subsets of the overall state vector during dedicated phases (e.g., motionless on the ground, online during flight, or in post-processing). These works often introduce an artificial, so-called body frame, which forces assumptions on system states such as the orientation of the inertia matrix's principal axes. Our approach estimates the entire state vector in the (usually not precisely known) center of mass, eliminating several assumptions caused by the artificially introduced body frame in other work. Since our approach also estimates geometric states such as the rotor and sensor placements, no hand measurements to the unknown center of mass are required: the system is fully self-calibrating. A detailed discussion of the system's observability reveals the additional (different) measurements required for a theoretical and a real N-arm multicopter. We show that quantities which are easy to hand-measure precisely in real applications can provide the required information. Statistically relevant simulations in Gazebo/RotorS, which provide ground truth for all states while retaining realistic physics, validate all our findings.
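To make the holistic, self-calibrating estimation idea concrete, the following minimal sketch augments standard navigation states with dynamic parameters (mass, inertia) and geometric calibration states (rotor and IMU positions relative to the center of mass) in a single extended Kalman filter. The state layout, dimensions, and interfaces are illustrative assumptions and do not reproduce the paper's actual filter formulation.

```python
import numpy as np

# Hypothetical augmented state layout (illustration only, not the paper's exact state):
#   0:3   position p (world frame, expressed at the center of mass)
#   3:6   velocity v
#   6:9   attitude error dtheta (minimal 3-DoF parametrization)
#   9     mass m
#  10:13  inertia diagonal (Ixx, Iyy, Izz)
#  13:16  rotor-1 position relative to the center of mass
#  16:19  IMU position relative to the center of mass
STATE_DIM = 19

class HolisticEKF:
    """Minimal EKF skeleton keeping navigation, parameter, and calibration states in one filter."""

    def __init__(self):
        self.x = np.zeros(STATE_DIM)
        self.x[9] = 1.0                    # prior mass guess [kg]
        self.x[10:13] = 0.01               # prior inertia guess [kg m^2]
        self.P = np.eye(STATE_DIM) * 0.1   # prior covariance

    def predict(self, f, F, Q):
        """Propagate with nonlinear model f, its Jacobian F, and process noise Q.
        Parameter and calibration states are modeled as (near-)constant, so their
        rows of F are identity and their process noise is small."""
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        """Standard EKF update with measurement z, model h(x), Jacobian H, noise R."""
        y = z - h(self.x)
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(STATE_DIM) - K @ H) @ self.P
```

Because the geometric states are part of the same filter, measurements taken during flight refine the rotor and sensor placements relative to the center of mass instead of relying on hand-measured offsets.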
While low-level autopilot stacks for aerial vehicles focus on robust control, sensing, and estimation, the continuous advancement of autonomy for aerial vehicles requires considerably more complex high-level flight stacks to enable safe, fully autonomous long-duration missions. Rather than focusing on low-level control, high-level flight stacks are required to continuously monitor the system's integrity, initiate contingency plans, execute mission plans and adapt them in non-nominal situations, allow for proper data logging, and provide standardized interfaces and integrity verification for external mission planners and localization modules. To that end, we present our freely available high-level flight stack (dubbed CNS Flight Stack) that meets the above requirements and, at the same time, a) is platform-agnostic through a generalized (embedded) hardware abstraction layer, b) has low computational complexity for online use on embedded hardware, and c) can be extended with further sensor modalities, integrity checks, and mission modules. These properties make it reproducible on a variety of different platforms for safe and fully autonomous applications. We tested the proposed flight stack in over 450 real-world flights and report the failure modes our framework detected and mitigated to avoid crashes of the aerial system.
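As a sketch of the integrity-monitoring and contingency idea, the snippet below cyclically evaluates a set of health checks and escalates to the most severe contingency requested by a failed check. The check names, severity ordering, and contingency actions are hypothetical assumptions and do not correspond to the CNS Flight Stack's actual interfaces.

```python
import time
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, List

class Contingency(Enum):
    CONTINUE = auto()   # nominal, keep executing the mission
    HOVER = auto()      # pause the mission and hold position
    LAND = auto()       # abort the mission and land immediately

@dataclass
class IntegrityCheck:
    name: str
    passed: Callable[[], bool]   # returns True while the subsystem is healthy
    on_failure: Contingency      # contingency requested when the check fails

def supervise(checks: List[IntegrityCheck], rate_hz: float = 10.0) -> Contingency:
    """Cyclically evaluate all integrity checks and return the most severe
    contingency requested by any failed check (illustration only)."""
    severity = {Contingency.CONTINUE: 0, Contingency.HOVER: 1, Contingency.LAND: 2}
    while True:
        action = Contingency.CONTINUE
        for check in checks:
            if not check.passed() and severity[check.on_failure] > severity[action]:
                action = check.on_failure
        if action is not Contingency.CONTINUE:
            return action   # hand over to the (hypothetical) mission/landing logic
        time.sleep(1.0 / rate_hz)

# Example: two hypothetical checks, e.g. pose-estimate freshness and battery level.
checks = [
    IntegrityCheck("pose_timeout", lambda: True, Contingency.HOVER),
    IntegrityCheck("battery_low", lambda: True, Contingency.LAND),
]
```

In a real stack, the check callables would query estimator timestamps, sensor health flags, or battery telemetry, and the returned contingency would be forwarded to the mission sequencer.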
For real-world applications, autonomous mobile robotic platforms must be capable of navigating safely in a multitude of different and dynamic environments, with accurate and robust localization being a key prerequisite. To support further research in this domain, we present the INSANE data sets: a collection of versatile Micro Aerial Vehicle (MAV) data sets for cross-environment localization. The data sets provide various scenarios with multiple stages of difficulty for localization methods. These scenarios range from trajectories in the controlled environment of an indoor motion-capture facility, to experiments in which the vehicle performs an outdoor maneuver and transitions into a building, requiring changes of sensor modalities, up to purely outdoor flight maneuvers in a challenging Mars analog environment that simulate scenarios current and future Mars helicopters would need to perform. The presented work aims to provide data that reflects real-world scenarios and sensor effects. The extensive sensor suite spans several sensor categories, including multiple Inertial Measurement Units (IMUs) and cameras. Sensor data is made available as raw measurements, and each data set provides highly accurate ground truth, including for the outdoor experiments, where a dual Real-Time Kinematic (RTK) Global Navigation Satellite System (GNSS) setup provides sub-degree and centimeter accuracy (1-sigma). The sensor suite also includes a dedicated high-rate IMU to capture all vibration dynamics of the vehicle during flight, supporting research on novel machine-learning-based sensor signal enhancement methods for improved localization. The data sets and post-processing tools are available at: https://sst.aau.at/cns/datasets
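Since the data sets ship raw measurements together with ground truth, a typical first processing step is to resample the ground truth onto the timestamps of a sensor stream or estimator output for evaluation. The sketch below illustrates this with synthetic stand-in data, as the abstract does not specify file formats; the actual layout is defined by the linked post-processing tools.

```python
import numpy as np

def interpolate_ground_truth(query_t: np.ndarray, gt_t: np.ndarray,
                             gt_xyz: np.ndarray) -> np.ndarray:
    """Linearly interpolate ground-truth positions onto the query timestamps
    (e.g., IMU or estimator output) so reference and estimate can be compared
    sample by sample."""
    return np.column_stack([np.interp(query_t, gt_t, gt_xyz[:, i]) for i in range(3)])

# Synthetic stand-in data; in practice, load the raw measurements and ground truth
# from the data set using the provided post-processing tools.
gt_t = np.linspace(0.0, 10.0, 101)                              # 10 Hz ground truth
gt_xyz = np.column_stack([np.sin(gt_t), np.cos(gt_t), 0.1 * gt_t])
imu_t = np.linspace(0.0, 10.0, 2001)                            # 200 Hz IMU timestamps
gt_at_imu = interpolate_ground_truth(imu_t, gt_t, gt_xyz)
print(gt_at_imu.shape)                                          # (2001, 3)
```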