2023
DOI: 10.1109/access.2023.3311359

Improved Multi-Sensor Fusion Positioning System Based on GNSS/LiDAR/Vision/IMU With Semi-Tight Coupling and Graph Optimization in GNSS Challenging Environments

Jiaming Zhu,
Han Zhou,
Ziyi Wang
et al.

Abstract: With the development of autonomous driving, precise positioning capabilities are becoming increasingly important. GNSS (Global Navigation Satellite System) is normally utilized for vehicle positioning, but it is susceptible to factors such as urban canyons, especially in today's increasingly urbanized environments. The interpretation of relative positioning information by means of multi-source sensors such as LiDAR (Light Detection And Ranging) or cameras has also been widely investigated, but there are deficiencies…
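The abstract's core idea, anchoring relative LiDAR/vision/IMU odometry with absolute GNSS fixes through graph optimization, can be illustrated with a toy one-dimensional pose graph. The sketch below is a generic illustration of graph-based fusion, not the paper's semi-tightly coupled system; the trajectory, the noise standard deviations (sigma_odom, sigma_gnss), and all measurement values are invented for the example.

```python
# Minimal 1-D pose-graph fusion sketch (illustrative only, not the paper's system).
# Odometry factors constrain consecutive poses; sparse GNSS factors anchor them globally.
import numpy as np

N = 6                                          # poses x_0 .. x_5 (toy example)
odom = np.array([1.0, 1.1, 0.9, 1.0, 1.05])    # relative measurements x_{k+1} - x_k
gnss = {0: 0.0, 3: 3.2, 5: 5.0}                # absolute fixes at a few poses (GNSS may drop out)
sigma_odom, sigma_gnss = 0.05, 0.5             # assumed noise standard deviations

rows, b, w = [], [], []
for k, d in enumerate(odom):                   # odometry residual: (x_{k+1} - x_k) - d
    r = np.zeros(N); r[k + 1], r[k] = 1.0, -1.0
    rows.append(r); b.append(d); w.append(1.0 / sigma_odom)
for k, z in gnss.items():                      # GNSS residual: x_k - z
    r = np.zeros(N); r[k] = 1.0
    rows.append(r); b.append(z); w.append(1.0 / sigma_gnss)

A = np.array(rows) * np.array(w)[:, None]      # weighted Jacobian of all factors
rhs = np.array(b) * np.array(w)                # weighted measurement vector
x, *_ = np.linalg.lstsq(A, rhs, rcond=None)    # maximum-likelihood pose estimates
print(np.round(x, 3))
```

Because the odometry factors only fix relative displacements, the sparse GNSS factors are what remove the global drift; this is the complementarity the abstract alludes to.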

Cited by 6 publications (5 citation statements)
References 34 publications
“…Like our proposed research, in [9], the authors presented a multi-sensor fusion approach to estimate the vehicle's trajectory utilizing an odometer, IMU, and vision camera for vehicle localization, achieving a low MSE score when comparing the pose estimation. However, multi-sensor (GNSS, IMU, LiDAR, and visual camera) approaches require high processing and power demands, as demonstrated in [24]. In contrast, our work used GPS and an RGB-D sensor for vehicle localization in exclusive lanes, dealing with fewer processing and synchronization requirements.…”
Section: Discussion (mentioning)
confidence: 99%
“…Multi-sensor data are used for simultaneous localization and mapping (SLAM), which represents one of the fundamental tasks to solve in autonomous vehicles. SLAM consists of constructing a map of an unknown environment while simultaneously inferring the vehicle's pose within the map [24]. The primary sensors employed for SLAM are LiDAR and cameras.…”
Section: Introduction (mentioning)
confidence: 99%
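The statement above defines SLAM as jointly estimating the vehicle's poses and the map. A minimal one-dimensional version of that joint estimation, with a single landmark observed from several poses and solved by linear least squares, is sketched below; the odometry and range values are made up, and a prior on the first pose removes the gauge freedom. This is a conceptual illustration, not the cited system.

```python
# Toy 1-D SLAM sketch: jointly estimate robot poses and one landmark position
# from odometry and range-to-landmark measurements (all values are invented).
import numpy as np

n_poses = 4
odom = [1.0, 1.0, 1.0]            # measured displacement between consecutive poses
ranges = [5.1, 3.9, 3.05, 2.0]    # measured (signed) distance landmark - pose_k
state_dim = n_poses + 1           # unknowns: x_0..x_3 and landmark l

rows, b = [], []
r = np.zeros(state_dim); r[0] = 1.0          # prior fixing x_0 = 0 (gauge constraint)
rows.append(r); b.append(0.0)
for k, d in enumerate(odom):                 # odometry: x_{k+1} - x_k = d
    r = np.zeros(state_dim); r[k + 1], r[k] = 1.0, -1.0
    rows.append(r); b.append(d)
for k, z in enumerate(ranges):               # observation: l - x_k = z
    r = np.zeros(state_dim); r[-1], r[k] = 1.0, -1.0
    rows.append(r); b.append(z)

est, *_ = np.linalg.lstsq(np.array(rows), np.array(b), rcond=None)
print("poses:", np.round(est[:-1], 2), "landmark:", round(est[-1], 2))
```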
“…Under the effective constraints of the INS, more reliable gross error detection can be achieved, exhibiting a good complementarity between the two systems. Among the combinations of the two, there are common configurations such as loosely coupled (LC), tightly coupled (TC), deeply coupled (DC) [6], and the recently proposed semi-tightly coupled (STC) [7]. In all these configurations, the INS serves to constrain the positioning results of the GNSS.…”
Section: Introduction (mentioning)
confidence: 99%
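The loosely coupled (LC) configuration mentioned above can be sketched as an INS prediction followed by a GNSS position update, with an innovation gate providing the kind of gross error detection described in the quote. The sketch below is a generic 1-D Kalman-filter illustration, not the paper's semi-tightly coupled (STC) method; the noise values (q, r) and the 3-sigma gate are assumptions for the example.

```python
# Sketch of a loosely coupled GNSS/INS step in 1-D: the INS propagates the state,
# a GNSS position fix updates it, and an innovation gate rejects gross GNSS errors.
# Noise values and the 3-sigma gate are illustrative assumptions only.
import numpy as np

def lc_step(x, P, accel, dt, gnss_pos, q=0.1, r=4.0, gate=3.0):
    # INS mechanization (constant-acceleration model): state x = [position, velocity]
    F = np.array([[1.0, dt], [0.0, 1.0]])
    u = np.array([0.5 * accel * dt**2, accel * dt])
    x = F @ x + u
    P = F @ P @ F.T + q * np.eye(2)

    # GNSS position update with innovation gating (gross error detection)
    H = np.array([[1.0, 0.0]])
    innov = gnss_pos - (H @ x)[0]
    S = (H @ P @ H.T)[0, 0] + r
    if innov**2 / S > gate**2:               # inconsistent with INS prediction: reject fix
        return x, P, False
    K = (P @ H.T / S).ravel()
    x = x + K * innov
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P, True

x, P = np.array([0.0, 1.0]), np.eye(2)
x, P, accepted = lc_step(x, P, accel=0.2, dt=1.0, gnss_pos=25.0)   # simulated outlier fix
print(accepted, np.round(x, 2))
```

In a tightly coupled design the update would use raw pseudoranges instead of the GNSS position solution, which is why TC and STC variants remain usable when fewer than four satellites are visible.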
“…As the number of satellites in orbit increases, the precision of satellite navigation has greatly improved. However, satellite navigation is susceptible to harsh environments, such as tunnels and urban canyons [2][3][4]. Cooperative vehicle infrastructure systems (CVIS) are widely applied to make up for the shortcomings of satellite positioning [5].…”
Section: Introduction (mentioning)
confidence: 99%