2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC)
DOI: 10.1109/itsc.2017.8317829
Automatic extrinsic calibration for lidar-stereo vehicle sensor setups

Abstract: Sensor setups consisting of a combination of 3D range scanner lasers and stereo vision systems are becoming a popular choice for on-board perception systems in vehicles; however, the combined use of both sources of information implies a tedious calibration process. We present a method for extrinsic calibration of lidar-stereo camera pairs without user intervention. Our calibration approach is aimed at coping with the constraints commonly found in automotive setups, such as low-resolution and specific sensor pose…
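The lidar-stereo pairing described in the abstract rests on the stereo rig producing metric 3D points that can be compared against lidar returns. As a minimal, hypothetical sketch (the function name and the numbers are illustrative, not taken from the paper), the rectified-stereo depth relation behind that step is:

```python
# Hypothetical illustration (not the paper's code): metric depth from a
# rectified stereo pair, the quantity that lets stereo-derived points be
# matched against lidar returns during extrinsic calibration.

def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth along the optical axis: Z = f * B / d, for rectified cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 0.12 m baseline, 8.4 px disparity -> 10.0 m.
depth = stereo_depth(8.4, 700.0, 0.12)
```

Note that depth grows as disparity shrinks, which is why low-resolution automotive stereo setups (as mentioned in the abstract) struggle at long range.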

Cited by 145 publications (109 citation statements)
References 16 publications
“…Three pairs of normal directions and checkerboard poses are enough to calculate the extrinsic parameters. By identifying multiple co-planar 3D positions from 3D point clouds, Guindel et al. [12] reduced the minimum amount of data required for calibration. Many other variations of target-based methods [8], [13] have been proposed to address the limitations of Zhang's method [11]; however, because they require dedicated artificial calibration targets, they are not suitable for in-situ calibration where such a target is not available.…”
Section: Related Work
confidence: 99%
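The excerpt above notes that three pairs of normal directions suffice to fix the rotation between the two sensors. A minimal sketch of that step, assuming the plane normals have already been extracted in each frame (the function name and the Kabsch/SVD formulation are illustrative, not the cited authors' code):

```python
import numpy as np

# Hypothetical sketch (not from the cited papers): given k >= 3 pairs of
# corresponding unit plane normals observed in the lidar frame (n_lid) and
# the camera frame (n_cam), the rotation R with n_cam ≈ R @ n_lid is the
# classical Kabsch/SVD alignment solution.

def rotation_from_normals(n_lid: np.ndarray, n_cam: np.ndarray) -> np.ndarray:
    """n_lid, n_cam: (k, 3) arrays of corresponding unit normals, k >= 3."""
    h = n_lid.T @ n_cam                      # 3x3 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against a reflection
    return vt.T @ np.diag([1.0, 1.0, d]) @ u.T

# Usage: rotate three canonical normals by a known rotation, then recover it.
r_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
n_lid = np.eye(3)
n_cam = n_lid @ r_true.T   # each row rotated by r_true
r_est = rotation_from_normals(n_lid, n_cam)
```

With noisy normals the same code returns the least-squares best-fit rotation, which is why more plane observations than the minimum of three improve robustness.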
“…Temporal calibration is also an important part of multimodal sensor fusion, but it is often not addressed in the literature because many systems aim either at static scanning [16] or at short-term operation [18]. However, in many practical robotics applications the system continuously receives asynchronous estimates from a LiDAR and an unconnected or unsynchronised visible-spectrum camera while in motion [12]. This is especially the case for hand-held systems, where the angular velocity changes rapidly.…”
Section: Related Work
confidence: 99%
“…There are many methods to estimate the relative pose of a camera with respect to a lidar sensor [3], [4], [6]–[8], [10], [17]. Dhall et al. [10] use a square plate as the calibration target, which carries an ArUco marker (a square black-and-white pattern) [18] to facilitate pose estimation with a monocular camera.…”
Section: A Multi-modal Calibration With Lidar, Camera and Radar
confidence: 99%
“…Until now, existing calibration tools have only addressed pairwise calibrations of at most two sensing modalities [3]–[12]. Since each modality has a different measurement principle, each proposed calibration procedure uses a different target design.…”
Section: Introduction
confidence: 99%
“…They require a specific calibration setup with multiple checkerboard targets. Simpler techniques that solve for the LiDAR-camera extrinsic parameters using easy-to-make targets and fewer correspondences have recently been proposed in [8], [9]. Although accurate, these techniques are slow, labor-intensive, and require careful tuning of several hyperparameters.…”
Section: Related Work
confidence: 99%
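Target-based methods like those quoted above typically recover the rotation first and then the translation. As a hedged sketch of the second step (my own illustration, not any cited paper's method): a plane with unit normal n and offset d_lid in the lidar frame appears in the camera frame with normal n_cam = R n and offset d_cam = d_lid + n_cam · t, so three or more plane correspondences pin down t by a small linear solve.

```python
import numpy as np

# Hypothetical sketch (assumption, not from the cited papers): once the
# lidar-to-camera rotation is known, each plane correspondence gives one
# linear constraint on the translation t:  n_cam · t = d_cam - d_lid.
# With k >= 3 non-parallel planes, t is the least-squares solution.

def translation_from_planes(n_cam: np.ndarray, d_lid: np.ndarray,
                            d_cam: np.ndarray) -> np.ndarray:
    """n_cam: (k, 3) unit plane normals in the camera frame; d_*: (k,) offsets."""
    t, *_ = np.linalg.lstsq(n_cam, d_cam - d_lid, rcond=None)
    return t

# Usage: three axis-aligned planes, camera offset by (0.1, -0.2, 0.3) m.
n_cam = np.eye(3)
d_lid = np.array([1.0, 1.0, 1.0])
t_true = np.array([0.1, -0.2, 0.3])
d_cam = d_lid + n_cam @ t_true
t_est = translation_from_planes(n_cam, d_lid, d_cam)
```

The k = 3 minimum mirrors the "fewer correspondences" theme in the excerpt: one checkerboard observed in three distinct poses already supplies enough plane constraints.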