In this paper, we present a probabilistic framework to recover the extrinsic calibration parameters of a lidar-IMU sensing system. Unlike global-shutter cameras, lidars do not take single snapshots of the environment. Instead, lidars collect a succession of 3D-points, generally grouped in scans. When these points are assumed to be expressed in a common frame, rapid sensor motion during acquisition causes motion distortion. The fundamental idea of our proposed framework is to use preintegration over interpolated inertial measurements to characterise the motion distortion in each lidar scan. Moreover, by using a set of planes as a calibration target, the proposed method makes use of lidar point-to-plane distances to jointly calibrate and localise the system using on-manifold optimisation. The calibration does not rely on a predefined target, as arbitrary planes are detected and modelled in the first lidar scan. Simulated and real data are used to show the effectiveness of the proposed method.
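To make the point-to-plane cost mentioned in this abstract concrete, the Python sketch below evaluates the residual for a single lidar point given a candidate extrinsic transform and an IMU pose. It is only an illustration of the kind of residual that would be minimised jointly over calibration and trajectory; the function and variable names (point_to_plane_residual, R_imu_lidar, n, d, etc.) are ours, not the paper's.

```python
import numpy as np

def point_to_plane_residual(p_lidar, R_imu_lidar, t_imu_lidar,
                            R_world_imu, t_world_imu, n, d):
    """Signed distance between a lidar point and a plane n.x = d in the world frame.

    p_lidar       : 3-vector, raw point in the lidar frame
    R_imu_lidar,
    t_imu_lidar   : extrinsic rotation/translation (lidar -> IMU), the calibration unknowns
    R_world_imu,
    t_world_imu   : IMU pose in the world frame at the point's timestamp
    n, d          : unit plane normal and offset (plane modelled from the first scan)
    """
    p_imu = R_imu_lidar @ p_lidar + t_imu_lidar        # express the point in the IMU frame
    p_world = R_world_imu @ p_imu + t_world_imu        # then in the world frame
    return float(n @ p_world - d)                      # point-to-plane distance

# Toy usage: a point that should lie on the ground plane z = 0.
r = point_to_plane_residual(
    p_lidar=np.array([2.0, 0.0, -1.5]),
    R_imu_lidar=np.eye(3), t_imu_lidar=np.array([0.1, 0.0, 0.05]),
    R_world_imu=np.eye(3), t_world_imu=np.array([0.0, 0.0, 1.45]),
    n=np.array([0.0, 0.0, 1.0]), d=0.0)
print(r)
```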
In this paper, we present INertial Lidar Localisation Autocalibration And MApping (IN2LAAMA): an offline probabilistic framework for localisation, mapping, and extrinsic calibration based on a 3D-lidar and a 6-DoF-IMU. Most of today's lidars collect geometric information about the surrounding environment by sweeping lasers across their field of view. Consequently, 3D-points in one lidar scan are acquired at different timestamps. If the sensor trajectory is not accurately known, the scans are affected by the phenomenon known as motion distortion. The proposed method leverages preintegration with a continuous representation of the inertial measurements to characterise the system's motion at any point in time. It enables precise correction of the motion distortion without relying on any explicit motion model. The system's pose, velocity, biases, and time-shift are estimated via a full batch optimisation that includes automatically generated loop-closure constraints. The autocalibration and the registration of lidar data rely on planar and edge features matched across pairs of scans. The performance of the framework is validated through simulated and real-data experiments.
In this paper, we present INertial Lidar Localisation Autocalibration And MApping (IN2LAAMA): a probabilistic framework for localisation, mapping, and extrinsic calibration based on a 3D-lidar and a 6-DoF-IMU. Most of today's lidars collect geometric information about the surrounding environment by sweeping lasers across their field of view. Consequently, 3D-points in one lidar scan are acquired at different timestamps. If the sensor trajectory is not accurately known, the scans are affected by the phenomenon known as motion distortion. The proposed method leverages preintegration with a continuous representation of the inertial measurements to characterise the system's motion at any point in time. It enables precise correction of the motion distortion without relying on any explicit motion model. The system's pose, velocity, biases, and time-shift are estimated via a full batch optimisation that includes automatically generated loop-closure constraints. The autocalibration and the registration of lidar data rely on planar and edge features matched across pairs of scans. The performance of the framework is validated through simulated and real-data experiments.
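The motion distortion discussed in the two IN2LAAMA abstracts above can be illustrated with a simple "deskewing" step: each point is transformed with the sensor pose interpolated at its own timestamp. The sketch below uses plain pose interpolation (slerp for rotation, linear for translation) as a stand-in; IN2LAAMA instead characterises the motion with preintegrated inertial measurements, and the names (deskew_scan, pose0, pose1) are illustrative only.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_scan(points, timestamps, t0, t1, pose0, pose1):
    """Re-express every point of a scan in the sensor frame at the scan start time t0.

    points     : (N, 3) raw points, each acquired at its own timestamp
    timestamps : (N,) acquisition time of each point, within [t0, t1]
    pose0/1    : (R, t) sensor poses in the world frame at t0 and t1
    Returns (N, 3) points as if the whole scan had been taken at t0.
    """
    R0, p0 = pose0
    R1, p1 = pose1
    slerp = Slerp([t0, t1], Rotation.from_matrix([R0, R1]))
    alphas = (timestamps - t0) / (t1 - t0)

    out = np.empty_like(points)
    for i, (pt, a, ti) in enumerate(zip(points, alphas, timestamps)):
        Ri = slerp([ti]).as_matrix()[0]      # interpolated rotation at this point's time
        pi = (1.0 - a) * p0 + a * p1         # interpolated translation
        pw = Ri @ pt + pi                    # point in the world frame
        out[i] = R0.T @ (pw - p0)            # back into the frame at t0
    return out

# Toy usage: 3 points taken while the sensor translates 0.2 m along x during the scan.
pts = np.array([[5.0, 0.0, 0.0], [0.0, 5.0, 0.0], [-5.0, 0.0, 0.0]])
ts = np.array([0.00, 0.05, 0.10])
pose0 = (np.eye(3), np.zeros(3))
pose1 = (np.eye(3), np.array([0.2, 0.0, 0.0]))
print(deskew_scan(pts, ts, 0.0, 0.1, pose0, pose1))
```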
In this paper, we present Gaussian Process Preintegration, a preintegration theory based on continuous representations of inertial measurements. A novel use of linear operators on Gaussian Process kernels is employed to generate the proposed Gaussian Preintegrated Measurements (GPMs). This formulation allows the analytical integration of inertial signals on any time interval. Consequently, GPMs are especially suited for asynchronous inertial-aided estimation frameworks. Unlike discrete preintegration approaches, the proposed method does not rely on any explicit motion model and does not suffer from numerical integration noise. Additionally, we provide the analytical derivation of the Jacobians involved in the first-order expansion for postintegration bias and inter-sensor time-shift correction. We benchmarked the proposed method against existing preintegration methods on simulated data. Our experiments show that GPMs produce the most accurate results and their computation time allows close-to-real-time operation. We validated the suitability of GPMs for inertial-aided estimation by integrating them into a lidar-inertial localisation and mapping framework.
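The core idea of querying the inertial signal over an arbitrary interval can be illustrated with standard GP regression. The sketch below fits a zero-mean GP with a squared-exponential kernel to noisy accelerometer samples and integrates the posterior mean numerically; this is a simplified stand-in for the analytical kernel integration (via linear operators on the kernel) that GPMs perform, and all names and values here are illustrative.

```python
import numpy as np

def rbf(a, b, ls=0.05, var=1.0):
    """Squared-exponential kernel between two sets of times."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior_mean(t_train, y_train, t_query, noise=1e-2):
    """Posterior mean of a zero-mean GP at the query times."""
    K = rbf(t_train, t_train) + noise * np.eye(len(t_train))
    Ks = rbf(t_query, t_train)
    return Ks @ np.linalg.solve(K, y_train)

# Noisy one-axis accelerometer samples at 100 Hz (toy signal).
t = np.arange(0.0, 0.2, 0.01)
a_meas = np.sin(10 * t) + 0.05 * np.random.randn(t.size)

# Query the continuous representation on an arbitrary interval (e.g. between two
# asynchronous sensor timestamps) and integrate it into a velocity increment.
tq = np.linspace(0.033, 0.147, 200)
a_hat = gp_posterior_mean(t, a_meas, tq)
dv = np.sum(0.5 * (a_hat[1:] + a_hat[:-1]) * np.diff(tq))  # numerical stand-in for the analytical integral
print(dv)
```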
In this paper, we introduce IDOL, an optimization-based framework for IMU-DVS Odometry using Lines. Event cameras, also called Dynamic Vision Sensors (DVSs), generate highly asynchronous streams of events triggered upon illumination changes for each individual pixel. This novel paradigm presents advantages in low illumination conditions and high-speed motions. Nonetheless, this unconventional sensing modality brings new challenges to perform scene reconstruction or motion estimation. The proposed method leverages a continuous-time representation of the inertial readings to associate each event with timely accurate inertial data. The method's front-end extracts event clusters that belong to line segments in the environment, whereas the back-end estimates the system's trajectory alongside the lines' 3D position by minimizing point-to-line distances between individual events and the lines' projection in the image space. A novel attraction/repulsion mechanism is presented to accurately estimate the lines' extremities, avoiding their explicit detection in the event data. The proposed method is benchmarked against a state-of-the-art frame-based visual-inertial odometry framework using public datasets. The results show that IDOL achieves accuracy of the same order of magnitude as the frame-based baseline on most datasets, and even provides better orientation estimates. These findings can have a significant impact on the design of new algorithms for DVSs.
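The point-to-line residual described in this abstract can be sketched as the image-space distance between an event's pixel location and the projection of a line segment. The snippet below computes that distance for a single event; the segment extremities and the name point_to_segment_distance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def point_to_segment_distance(px, a, b):
    """Image-space distance between an event pixel px and a 2D segment [a, b].

    px, a, b : 2-vectors in pixel coordinates; a and b are the projected line extremities.
    """
    ab = b - a
    t = np.dot(px - a, ab) / np.dot(ab, ab)   # projection parameter along the segment
    t = np.clip(t, 0.0, 1.0)                  # clamp to the segment's extremities
    closest = a + t * ab
    return np.linalg.norm(px - closest)

# Toy usage: one event at pixel (120.4, 83.1) against a projected line segment.
d = point_to_segment_distance(np.array([120.4, 83.1]),
                              np.array([100.0, 80.0]),
                              np.array([140.0, 90.0]))
print(d)  # residual of the kind accumulated over events in the back-end optimisation
```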