2021
DOI: 10.1002/rob.22040

EIL‐SLAM: Depth‐enhanced edge‐based infrared‐LiDAR SLAM

Abstract: Traditional simultaneous localization and mapping (SLAM) approaches that use visible-light cameras or light detection and ranging (LiDAR) sensors frequently fail in dusty, low-textured, or completely dark environments. To address this problem, this study proposes a novel approach that tightly couples perception data from a thermal infrared camera and a LiDAR, building on the advantages of the former. However, applying a thermal infrared camera directly to existing SLAM frameworks is difficult because of the sensor differenc…

Cited by 21 publications (9 citation statements)
References 33 publications

“…Additionally, the dataset captured objects from favorable viewpoints, with each entry containing image sequences, corresponding contours, and complete calibration parameters. For a more scientifically rigorous assessment of the proposed algorithm’s performance, the study conducted comparative experiments with mainstream SLAM algorithms, including Improved SLAM Based on LiDAR (LiDARO-SLAM) and SLAM Based on the Fusion of Binocular Wire Features and Inertial Navigation (BWFIN-SLAM) [ 29 , 30 ].…”
Section: Analysis of IMS-VSLAM Algorithm for Wheeled Robots
confidence: 99%
“…The significance of simultaneous localization and mapping (SLAM) technology and trajectory interpolation for mobile robots and autonomous driving has been increasing due to the continuous development of artificial intelligence technology [1][2][3][4][5][6]. SLAM algorithms and trajectory interpolation have been successfully applied in various fields, including campus inspection, logistics and distribution, and unmanned driving.…”
Section: Introduction
confidence: 99%
“…However, this method is unsatisfactory as it does not take into account the semantic information of the surrounding environment, which is crucial for human beings to recognize whether a place has been reached or not [9][10]. Unfortunately, there […] The significance of simultaneous localization and mapping (SLAM) technology and trajectory interpolation for mobile robots and autonomous driving has been increasing due to the continuous development of artificial intelligence technology [1][2][3][4][5][6]. SLAM algorithms and trajectory interpolation have been successfully applied in various fields, including campus inspection, logistics and distribution, and unmanned driving.…”
Section: Introduction
confidence: 99%
“…HIBAS has important implications for fields that use 3D lidar data in real-time applications, such as robotics and autonomous driving, where the handling of large point clouds is a crucial task. In these fields, the use of advanced sensor technologies, such as lidar, has led to the generation of voluminous and multifaceted point clouds that cannot be applied directly to real-time perception problems (Nguyen et al, 2019), such as simultaneous localisation and mapping (SLAM) (Chen et al, 2022), 3D object reconstruction (Karami et al, 2022), 3D environment exploration (Butkiewicz et al, 2008), object detection (Burkert et al, 2011), classification (Mayr et al, 2017) and segmentation (Awwad et al, 2010). By reducing the size of the point cloud and simplifying the data, the proposed method can facilitate the development of real-time applications that require high accuracy and low computational cost (Han et al, 2022).…”
Section: Introduction
confidence: 99%
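
The citation statement above argues that raw LiDAR point clouds are too large to feed directly into real-time perception tasks such as SLAM, and must first be reduced. As a minimal illustration of that idea only (not the HIBAS method from the citing work, nor the EIL-SLAM pipeline), the sketch below shows a common voxel-grid downsampling step in Python with NumPy; the function name `voxel_downsample` and the parameter values are illustrative assumptions.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Reduce a point cloud by keeping one centroid per occupied voxel.

    points: (N, 3) array of x, y, z coordinates.
    voxel_size: edge length of the cubic voxels, in the same units as points.
    (Illustrative sketch; not the method of the cited papers.)
    """
    # Assign each point to an integer voxel index.
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)

    # Group points that fall into the same voxel.
    _, inverse, counts = np.unique(
        voxel_idx, axis=0, return_inverse=True, return_counts=True
    )
    inverse = inverse.reshape(-1)

    # Average the points of each voxel to get one representative point.
    centroids = np.zeros((counts.shape[0], 3))
    np.add.at(centroids, inverse, points)
    centroids /= counts[:, None]
    return centroids

if __name__ == "__main__":
    # Synthetic 100k-point cloud reduced with 0.2 m voxels (assumed values).
    cloud = np.random.uniform(-50.0, 50.0, size=(100_000, 3))
    reduced = voxel_downsample(cloud, voxel_size=0.2)
    print(f"{cloud.shape[0]} points -> {reduced.shape[0]} points")
```

Averaging one centroid per occupied voxel bounds the output size by the number of occupied voxels rather than the number of raw returns, which is what makes downstream steps such as scan matching or object detection tractable at sensor frame rates.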