2023
DOI: 10.3390/jimaging9020052
LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios

Abstract: LiDAR-based simultaneous localization and mapping (SLAM) and online localization methods are widely used in autonomous driving and are key components of intelligent vehicles. However, current SLAM algorithms suffer from map drift, and localization algorithms based on a single sensor adapt poorly to complex scenarios. This paper proposes a SLAM and online localization method based on multi-sensor fusion, integrated into a general framework. In the mapping process, constraints consisting of…

Cited by 7 publications (3 citation statements) · References 33 publications
“…By incorporating vision, radar, or LiDAR sensors, they can provide additional position measurements when GNSS data are not available (e.g., inside a tunnel). For example, Dai et al. (2023) estimated the vehicle motion by combining LiDAR with real-time kinematic GNSS. Initially, the Normal Distributions Transform (NDT) algorithm was used to register the point clouds.…”
Section: Results
confidence: 99%
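As a rough, generic illustration of the NDT registration step mentioned above, the C++ sketch below uses PCL's pcl::NormalDistributionsTransform to align a live scan against a map cloud. The file names, parameter values, and the identity initial guess are illustrative assumptions (in practice an RTK-GNSS or previous-pose prior would seed the alignment); this is not the configuration used by Dai et al. (2023).

```cpp
// Minimal NDT scan-to-map registration sketch using PCL (assumed available).
#include <iostream>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/approximate_voxel_grid.h>
#include <pcl/registration/ndt.h>

int main() {
  using Cloud = pcl::PointCloud<pcl::PointXYZ>;

  Cloud::Ptr target(new Cloud), source(new Cloud);
  // Hypothetical file names: a prebuilt map tile and the current LiDAR scan.
  pcl::io::loadPCDFile<pcl::PointXYZ>("map_tile.pcd", *target);
  pcl::io::loadPCDFile<pcl::PointXYZ>("current_scan.pcd", *source);

  // Downsample the live scan so registration stays fast.
  Cloud::Ptr filtered(new Cloud);
  pcl::ApproximateVoxelGrid<pcl::PointXYZ> voxel;
  voxel.setLeafSize(0.5f, 0.5f, 0.5f);
  voxel.setInputCloud(source);
  voxel.filter(*filtered);

  // Configure NDT with typical tutorial-style parameters (assumed, not tuned).
  pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
  ndt.setTransformationEpsilon(0.01);  // convergence threshold on the transform
  ndt.setStepSize(0.1);                // max step for the More-Thuente line search
  ndt.setResolution(1.0);              // NDT voxel (cell) size in metres
  ndt.setMaximumIterations(35);
  ndt.setInputSource(filtered);
  ndt.setInputTarget(target);

  // Initial guess: identity here; a GNSS/odometry prior would normally go here.
  Eigen::Matrix4f init_guess = Eigen::Matrix4f::Identity();

  Cloud aligned;
  ndt.align(aligned, init_guess);

  std::cout << "converged: " << ndt.hasConverged()
            << ", fitness: " << ndt.getFitnessScore() << "\n"
            << "estimated pose:\n" << ndt.getFinalTransformation() << std::endl;
  return 0;
}
```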
“…Line features are commonly used to represent environments in which straight or flat structures are dominant. These features are often observed in maps constructed by means of LiDAR sensors [15], cameras [16] and RGB-D sensors [17]. In the existing literature, fusion algorithms for line features can be categorized into the following two major types: offline algorithms [18][19][20] and online algorithms [21][22][23][24].…”
Section: Literature Review
confidence: 99%
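As a generic illustration of what such a line feature is, the short C++/Eigen sketch below fits a 2D line to a cluster of planar-projected LiDAR points by total least squares (PCA). The point values and the Line2D structure are assumptions made for illustration; this is not the extraction or fusion algorithm of any of the works cited above.

```cpp
// Total-least-squares (PCA) line fit for a cluster of 2D points (e.g. a wall).
#include <Eigen/Dense>
#include <iostream>
#include <vector>

struct Line2D {
  Eigen::Vector2d point;      // a point on the line (the cluster centroid)
  Eigen::Vector2d direction;  // unit direction of the line
};

Line2D fitLine(const std::vector<Eigen::Vector2d>& pts) {
  Eigen::Vector2d centroid = Eigen::Vector2d::Zero();
  for (const auto& p : pts) centroid += p;
  centroid /= static_cast<double>(pts.size());

  // Scatter matrix; its dominant eigenvector gives the line direction.
  Eigen::Matrix2d scatter = Eigen::Matrix2d::Zero();
  for (const auto& p : pts) {
    const Eigen::Vector2d d = p - centroid;
    scatter += d * d.transpose();
  }
  Eigen::SelfAdjointEigenSolver<Eigen::Matrix2d> es(scatter);
  // Eigenvalues are sorted in increasing order, so the last column is dominant.
  return {centroid, es.eigenvectors().col(1).normalized()};
}

int main() {
  // Synthetic, roughly collinear points standing in for a wall segment.
  std::vector<Eigen::Vector2d> pts = {
      {0.0, 0.02}, {1.0, 0.98}, {2.0, 2.01}, {3.0, 2.97}};
  const Line2D l = fitLine(pts);
  std::cout << "point: " << l.point.transpose()
            << "  direction: " << l.direction.transpose() << std::endl;
  return 0;
}
```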
“…Biological studies have indicated that organisms such as mantis shrimp [7], octopuses [8], locusts [9], and desert ants [10] can use their unique visual structures to perceive the polarization pattern of the entire sky and employ this information for self-orientation. Inspired by these biomimetic navigation mechanisms, polarization-based navigation has demonstrated advanced performance in unmanned aerial vehicles [11,12], unmanned ground vehicles [13][14][15], and other navigation domains. With advantages such as no cumulative error, resistance to electromagnetic interference, and robust concealment, it provides a novel solution for fully autonomous orientation in Global Navigation Satellite System (GNSS)-denied environments [16,17].…”
Section: Introduction
confidence: 99%