2019
DOI: 10.1109/lra.2019.2922618
Automatic Multi-Sensor Extrinsic Calibration For Mobile Robots

Abstract: In order to fuse measurements from multiple sensors mounted on a mobile robot, they must be expressed in a common reference system through their relative spatial transformations. In this paper, we present a method to estimate the full 6DoF extrinsic calibration parameters of multiple heterogeneous sensors (Lidars, Depth and RGB cameras) suitable for automatic execution on a mobile robot. Our method computes the 2D calibration parameters (x, y, yaw) through a motion-based approach, while for the remaining…
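The motion-based 2D step described in the abstract can be framed as a planar hand-eye problem A_i X = X B_i, where A_i and B_i are paired incremental motions of the robot and the sensor, and X is the unknown extrinsic. A minimal least-squares sketch of that idea (illustrative only, not the paper's implementation; the noiseless setup and function names are assumptions):

```python
import numpy as np

def rot2d(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def calibrate_planar(motions_a, motions_b):
    """Solve A_i X = X B_i for the planar extrinsic X = (x, y, yaw),
    given paired incremental motions (theta, t) of robot and sensor.
    The translation identity (R_A - I) t_X - R_X t_B = -t_A is linear
    in (t_x, t_y, cos yaw, sin yaw); needs >= 2 motions with distinct
    rotations to be well-posed."""
    rows, rhs = [], []
    for (th_a, t_a), (_, t_b) in zip(motions_a, motions_b):
        Ra = rot2d(th_a)
        M = np.zeros((2, 4))
        M[:, :2] = Ra - np.eye(2)          # coefficients of t_X
        M[0, 2], M[0, 3] = -t_b[0], t_b[1]  # coefficients of (c, s)
        M[1, 2], M[1, 3] = -t_b[1], -t_b[0]
        rows.append(M)
        rhs.append(-np.asarray(t_a))
    sol, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)
    tx, ty, c, s = sol
    return tx, ty, np.arctan2(s, c)  # arctan2 re-normalizes (c, s)
```

With noisy real data the paper's approach is of course more involved; this only shows why (x, y, yaw) are recoverable from per-sensor motion alone.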

Cited by 23 publications (10 citation statements)
References 26 publications
“…For the special case of indoor robotics and autonomous vehicles with planar motion, not all calibration parameters are observable since the rotation axes of all transformations are parallel [7]. Therefore, the non-observable parameter z as well as the roll and pitch angle are often estimated from a common ground plane [15], [18], [23] and only the remaining parameters x, y, and yaw angle are estimated from per-sensor motion. Since our approach is based on DQs, we can restrict the optimization on planar calibration only, i.e., x, y, and yaw angle, for the cost of only two additional constraints.…”
Section: Related Work
confidence: 99%
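The excerpt above notes that under planar motion the non-observable parameters z, roll, and pitch are typically recovered from a common ground plane. A minimal sketch of that idea: fit a plane to ground points by SVD and read height and tilt off its normal. Assumptions (not from the paper): points are given in the sensor frame, and roll/pitch follow a roll-about-x, pitch-about-y convention.

```python
import numpy as np

def ground_plane_params(points):
    """Estimate (z, roll, pitch) from ground points in the sensor
    frame (N x 3 array). The singular vector belonging to the
    smallest singular value of the centered points is the plane
    normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    if n[2] < 0:                     # orient the normal toward the sensor
        n = -n
    z = -float(n @ centroid)         # sensor height above the plane
    roll = np.arctan2(n[1], n[2])    # tilt about the sensor x-axis
    pitch = -np.arcsin(n[0])         # tilt about the sensor y-axis
    return z, roll, pitch
```

Together with the motion-based (x, y, yaw) estimate, this completes a full 6DoF calibration for planar platforms.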
“…This also includes the scaling for monocular camera motion estimation. If sensors do not provide simultaneous measurements but provide their data in a common time frame, the motion estimations can be interpolated, for example, similar to [23]. Further, all noise, if not stated differently, is assumed to originate from additive white Gaussian processes.…”
Section: Problem Formulation
confidence: 99%
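The excerpt mentions interpolating motion estimates into a common time frame when sensors are not simultaneously triggered. For planar poses this can be as simple as per-component linear interpolation with yaw unwrapping; a sketch (illustrative, not the cited method, which interpolates full rigid-body motions):

```python
import numpy as np

def interp_poses(t_query, t_ref, xs, ys, yaws):
    """Resample planar poses (x, y, yaw) at the query timestamps.
    Yaw is unwrapped before interpolation so it behaves across the
    +/-pi boundary, then wrapped back to (-pi, pi]."""
    x = np.interp(t_query, t_ref, xs)
    y = np.interp(t_query, t_ref, ys)
    yaw = np.interp(t_query, t_ref, np.unwrap(yaws))
    yaw = np.arctan2(np.sin(yaw), np.cos(yaw))  # wrap back
    return x, y, yaw
```

For full 6DoF motions the rotational part would instead be interpolated on the rotation manifold (e.g. quaternion slerp).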
“…with near zero pitch. The extrinsic calibration parameters between the sensors were estimated using the automatic multi-sensor method proposed in [25]. From the two recorded sequences, one was used to perform the depth calibration described in Section 4, while the other one was used for evaluation purposes.…”
Section: Giraff Robot
confidence: 99%
“…Furthermore, the system is re-calibrated every time before the data acquisition as discussed in [19], which ensures that every numerical discrepancy due to wear and tear is captured and the fusion framework uses that information to achieve higher accuracy. Multi-sensor calibration is a crucial research problem which indirectly affects the 3D fusion process; however, to keep the focus of this research on 3D reconstruction, readers are encouraged to see [20]–[22]. Similarly, hardware as well as software considerations have been incorporated by Geiger et.…”
Section: A Camera Setup
confidence: 99%