2018 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2018.8462915
Elastic LiDAR Fusion: Dense Map-Centric Continuous-Time SLAM

Abstract: The concept of continuous-time trajectory representation has brought increased accuracy and efficiency to multi-modal sensor fusion in modern SLAM. However, despite these advantages, its offline nature, caused by the requirement of global batch optimization, critically hinders its relevance for real-time and life-long applications. In this paper, we present a dense map-centric SLAM method based on a continuous-time trajectory to cope with this problem. The proposed system locally functions in a simi…

Cited by 95 publications (67 citation statements) | References 18 publications
“…2) Experiments with Real Data: We also performed experiments with real data to evaluate the accuracy of the closed-form solution for estimating the extrinsic parameters. The monocular camera trajectories were estimated by visual odometry [22], while the continuous-time trajectory of the LiDAR sensor was estimated by our previous work [2], [21]. Prior to the closed-form solution calculation, the two trajectories are roughly synchronized automatically by detecting the start of the motion, using the rotation velocity of the LiDAR trajectory and the mean feature movement in the 2D image space, as depicted in Fig.…”
Section: Experiments With Simulated Data
confidence: 99%
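The excerpt above synchronizes the camera and LiDAR trajectories by detecting when each sensor starts to move. The sketch below illustrates that idea under assumed inputs; the function names and thresholds are hypothetical and are not taken from the cited papers.

```python
# Rough temporal alignment of a camera and a LiDAR trajectory by motion-onset
# detection, as described in the excerpt above. Minimal sketch with assumed
# inputs; thresholds and helper names are hypothetical, not from the paper.
import numpy as np

def motion_onset(timestamps, motion_signal, threshold):
    """Return the first timestamp at which the motion signal exceeds threshold."""
    idx = np.argmax(np.asarray(motion_signal) > threshold)  # first True index
    return timestamps[idx]

def estimate_time_offset(lidar_t, lidar_angular_speed,
                         cam_t, cam_mean_feature_motion,
                         w_thresh=0.05, px_thresh=2.0):
    """Offset (seconds) to add to camera timestamps so both trajectories start moving together.

    lidar_angular_speed: rotational velocity magnitude [rad/s] sampled along the
                         continuous-time LiDAR trajectory.
    cam_mean_feature_motion: mean 2D feature displacement [px] between consecutive frames.
    """
    t_lidar_start = motion_onset(lidar_t, lidar_angular_speed, w_thresh)
    t_cam_start = motion_onset(cam_t, cam_mean_feature_motion, px_thresh)
    return t_lidar_start - t_cam_start
```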
“…1. Given a motion-undistorted local point cloud from LiDAR and corresponding camera images produced by the sliding window-based local trajectory optimization [2], as shown in Fig. 2, the place recognition module finds a possible revisit of a place and triggers the localization step to estimate an alignment between the current location and the reference location.…”
Section: Overview of the System
confidence: 99%
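The excerpt above describes the overall flow: sliding-window continuous-time optimization yields a motion-undistorted local submap, place recognition checks for a revisit, and a localization step estimates the alignment to the reference place. The control-flow sketch below mirrors that structure; every class and method name is a hypothetical stand-in, not the authors' API.

```python
# Control-flow sketch of the pipeline described in the excerpt above.
# All class and method names are hypothetical stand-ins.
from dataclasses import dataclass
import numpy as np

@dataclass
class LocalSubmap:
    cloud: np.ndarray   # (N, 3) motion-undistorted points in the local frame
    images: list        # camera images associated with the window
    pose: np.ndarray    # (4, 4) pose of the submap in the odometry frame

def process_window(front_end, place_db, localizer, raw_scans, raw_images):
    # 1. Sliding-window continuous-time optimization -> undistorted local map.
    submap: LocalSubmap = front_end.optimize(raw_scans, raw_images)

    # 2. Place recognition: query the database of previously visited places.
    match = place_db.query(submap)
    if match is not None:
        # 3. Localization: estimate the rigid alignment between the current
        #    submap and the matched reference submap (e.g. by ICP).
        T_current_to_reference = localizer.align(submap.cloud, match.cloud)
        return submap, T_current_to_reference

    place_db.insert(submap)
    return submap, None
```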
“…In the proposed method, the synchronization problems of multi-modal sensors, such as different frame rates or unsynchronized clocks, must be properly addressed. To cope with this problem, our method uses the continuous-time trajectory representation [2]. In the continuous-time trajectory representation, the trajectory is modeled as a function of time.…”
Section: A Continuous-Time Trajectory Representation
confidence: 99%
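The excerpt above motivates the continuous-time representation: because the trajectory is a function of time, a pose can be queried at any sensor timestamp regardless of frame rates or clock offsets. The sketch below illustrates the idea with simple linear/SLERP interpolation between control poses; the paper's actual parameterization (e.g. a spline) may differ, so treat this as an assumption-laden illustration.

```python
# Minimal sketch of a continuous-time trajectory: discrete control poses are
# interpolated so a pose can be queried at any timestamp, which is what lets
# sensors with different clocks or rates be fused. Interpolation scheme is an
# illustrative assumption, not necessarily the paper's.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

class ContinuousTrajectory:
    def __init__(self, times, positions, rotations):
        """times: (K,) seconds; positions: (K, 3); rotations: scipy Rotation of length K."""
        self.times = np.asarray(times)
        self.positions = np.asarray(positions)
        self.slerp = Slerp(self.times, rotations)  # spherical interpolation of orientation

    def pose_at(self, t):
        """Return (R, p): orientation and position at query time t (within the time range)."""
        p = np.array([np.interp(t, self.times, self.positions[:, i]) for i in range(3)])
        R = self.slerp([t])[0]
        return R, p

# Usage: query the LiDAR trajectory at a camera timestamp to relate the two sensors.
# traj = ContinuousTrajectory(times, positions, Rotation.from_quat(quats))
# R_wc, p_wc = traj.pose_at(camera_timestamp)
```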
“…Volumetric 3D reconstruction for rigid scenes and objects is a well-studied problem in computer vision and robotics [1,2,3]. Reconstructed 3D maps are often fused with other complementary modalities such as RGB information [4], non-visible imaging information such as thermal-infrared [5,6,7], or sound [8], for applications such as medical imaging [9], disaster response [10], and energy auditing [11].…”
Section: Introduction
confidence: 99%