2021
DOI: 10.1016/j.eswa.2021.114894

A Camera to LiDAR calibration approach through the optimization of atomic transformations

Cited by 12 publications (4 citation statements)
References 42 publications
“… represents the extrinsic parameters between the two sensors, consisting of a rotation matrix and a translation matrix. These extrinsic parameters, which indicate the relative position and direction between the two sensors, can be obtained through calibration [22]. …”
Section: Preliminaries (mentioning, confidence: 99%)
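As context for the statement above, the extrinsic parameters it describes define a rigid-body transform between the two sensor frames. In one common formulation (notation assumed here, not taken from the cited papers), a LiDAR point in homogeneous coordinates maps into the camera frame as

\[
\mathbf{p}^{\text{cam}} \;=\;
\begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \mathbf{0}^{\top} & 1 \end{bmatrix}
\mathbf{p}^{\text{lidar}},
\qquad \mathbf{R} \in SO(3),\; \mathbf{t} \in \mathbb{R}^{3},
\]

where the rotation \(\mathbf{R}\) and translation \(\mathbf{t}\) are the quantities recovered by the calibration procedure.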
“…More recently, [8] presents a simultaneous general approach for hand–eye calibration for multiple cameras, based on optimization of atomic transformations dubbed ATOM. In [9], the authors build upon ATOM to allow multimodal sensors, but the method still necessitates a camera due to reprojection error being used as optimization metric. Both calibration methods are target-based and the sensors must share field of view.…”
Section: Related Work (mentioning, confidence: 99%)
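The statement above notes that the ATOM-based method in [9] still requires a camera because reprojection error serves as the optimization metric. The sketch below illustrates a reprojection-error cost of that general kind, assuming a pinhole camera model and the variable names shown; it is an illustrative example, not the authors' ATOM implementation.

import numpy as np

def reprojection_error(points_lidar, pixels_detected, R, t, K):
    """Mean pixel distance between detected target corners and LiDAR points
    projected into the camera through extrinsics (R, t) and intrinsics K.

    points_lidar    : (N, 3) 3-D points in the LiDAR frame
    pixels_detected : (N, 2) corresponding corner detections in the image
    R, t            : (3, 3) rotation and (3,) translation, LiDAR -> camera
    K               : (3, 3) camera intrinsic matrix
    """
    # Transform the 3-D points from the LiDAR frame into the camera frame.
    points_cam = (R @ points_lidar.T).T + t        # (N, 3)
    # Pinhole projection (lens distortion omitted in this sketch).
    uv_hom = (K @ points_cam.T).T                  # (N, 3)
    uv = uv_hom[:, :2] / uv_hom[:, 2:3]            # (N, 2) pixel coordinates
    # Average Euclidean distance in pixels.
    return np.linalg.norm(uv - pixels_detected, axis=1).mean()

Minimizing such a cost over a 6-DoF parameterization of (R, t), for example with a generic nonlinear least-squares solver, is a typical way to recover the camera-to-LiDAR extrinsics; it also makes clear why a camera is indispensable when this metric is used.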