2018
DOI: 10.1002/rob.21831
RTAB‐Map as an open‐source lidar and visual simultaneous localization and mapping library for large‐scale and long‐term online operation

Abstract: Distributed as an open-source library since 2013, real-time appearance-based mapping (RTAB-Map) started as an appearance-based loop closure detection approach with memory management to deal with large-scale and long-term online operation. It then grew to implement simultaneous localization and mapping (SLAM) on various robots and mobile platforms. As each application brings its own set of constraints on sensors, processing capabilities, and locomotion, it raises the question of which SLAM approach is the most …

Cited by 710 publications (399 citation statements)
References 81 publications
“…Each robot collects images from an onboard stereo camera and uses a (single-robot) Stereo Visual Odometry module to produce an estimate of its trajectory. In our implementation, we use the stereo odometry from RTAB-Map [45]. The images are also fed to the Distributed Loop Closure Detection module (Section III-A) which communicates information with other robots (when they are within communication range) and outputs inter-robot loop closure measurements.…”
Section: The DOOR-SLAM System
confidence: 99%
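The appearance-based loop-closure detection that such a pipeline relies on can be illustrated with a minimal sketch: each image is reduced to a set of visual-word IDs, and a loop closure is hypothesized when the similarity to a sufficiently old frame exceeds a threshold. The set-based signature, the threshold, and the `min_gap` parameter below are illustrative assumptions, not RTAB-Map's actual API.

```python
def jaccard(a: set, b: set) -> float:
    """Similarity between two bag-of-words image signatures."""
    return len(a & b) / len(a | b) if a | b else 0.0

def detect_loop_closures(frames, threshold=0.5, min_gap=3):
    """Return (current, past) index pairs whose signatures match.

    frames: list of sets of visual-word IDs, one per image.
    min_gap keeps trivially adjacent frames from matching each other.
    """
    closures = []
    for i, sig in enumerate(frames):
        for j in range(0, i - min_gap):
            if jaccard(sig, frames[j]) >= threshold:
                closures.append((i, j))
    return closures
```

In a multi-robot setting like the one quoted above, the same matching step runs on signatures exchanged between robots, yielding inter-robot loop closures instead of intra-robot ones.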
“…where (x, y) is the position and θ the orientation. This step can exploit any range-based or visual localization/SLAM algorithm; notably, the provided framework supports and was tested with techniques already available on ROS, such as Gmapping SLAM [34], AMCL [35], and the RTAB-Map library [36], which was initially developed for appearance-based loop closure and memory handling for large-scale scene mapping. These libraries provide localization and mapping techniques for several sensory modalities, including RGB-D, stereo, or monocular camera settings, for both 2D grid-based representations and 3D textured point clouds of the scene.…”
Section: Localization and Mapping
confidence: 99%
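The (x, y, θ) pose mentioned above lives in SE(2), and chaining relative odometry increments onto it is a standard composition that any of the cited SLAM front-ends performs internally. A minimal sketch (the `Pose2D` type and function names are illustrative, not part of any cited library):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float
    y: float
    theta: float  # orientation in radians

def compose(a: Pose2D, d: Pose2D) -> Pose2D:
    """Apply a relative motion d, expressed in a's frame, to pose a."""
    c, s = math.cos(a.theta), math.sin(a.theta)
    return Pose2D(a.x + c * d.x - s * d.y,
                  a.y + s * d.x + c * d.y,
                  a.theta + d.theta)
```

For example, a robot facing +y (θ = π/2) that moves 1 m "forward" in its own frame ends up at (0, 1) in the world frame.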
“…A single sensor has unique advantages and limitations. Combining multiple sensors can effectively improve the performance of a SLAM system [15]. The commonly used sensor configuration is visual plus IMU, which results in visual inertial SLAM (VI-SLAM).…”
Section: Sensor Fusion
confidence: 99%
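The visual-plus-IMU fusion described above can be caricatured by a complementary filter on heading: integrate the gyro rate (accurate short-term, drifts long-term) and correct it with the camera's absolute heading estimate (noisy but drift-free). The gain `alpha` and the function below are illustrative assumptions, a sketch of the fusion idea rather than a VI-SLAM implementation.

```python
def fuse_heading(gyro_rates, visual_headings, dt=0.01, alpha=0.98):
    """Complementary filter: blend integrated gyro with visual heading.

    gyro_rates: angular-velocity samples (rad/s), one per step.
    visual_headings: absolute heading estimates from vision (rad).
    alpha weights the gyro path; (1 - alpha) weights the visual fix.
    """
    est = visual_headings[0]
    fused = []
    for w, h_vis in zip(gyro_rates, visual_headings):
        est = alpha * (est + w * dt) + (1.0 - alpha) * h_vis
        fused.append(est)
    return fused
```

When both sources agree, the estimate stays put; when the gyro drifts, the visual term pulls it back at a rate set by (1 − alpha).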