2021
DOI: 10.3390/s21062140
Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review

Abstract: With the significant advancement of sensor and communication technology and the reliable application of obstacle detection techniques and algorithms, automated driving is becoming a pivotal technology that can revolutionize the future of transportation and mobility. Sensors are fundamental to the perception of vehicle surroundings in an automated driving system, and the use and performance of multiple integrated sensors can directly determine the safety and feasibility of automated driving vehicles. Sensor cal…

Cited by 485 publications
(219 citation statements)
References 98 publications
“…Due to the complexity of the camera projection model and the difficulty of assembling high-quality sensors, accurately calibrating panoramic vision systems is more challenging than calibrating ordinary vision systems [38]. Because scale information is extracted from the environment, calibration is the necessary first step: it determines the mapping between environment point coordinates and image pixel coordinates and corrects these errors when panoramic vision systems are applied.…”
Section: Background and Related Work
confidence: 99%
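The mapping the excerpt describes — from environment point coordinates to image pixel coordinates — can be illustrated with a minimal pinhole-camera sketch. This is a simplification (panoramic systems use more complex projection models), and the intrinsic values below are invented for illustration; in practice they come from a calibration procedure such as checkerboard-based estimation.

```python
import numpy as np

# Hypothetical intrinsic matrix K: focal lengths fx, fy and principal
# point cx, cy. Real values are estimated during calibration.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam: np.ndarray) -> np.ndarray:
    """Map a 3D point in camera coordinates to pixel coordinates (u, v)."""
    uvw = K @ point_cam          # homogeneous image coordinates
    return uvw[:2] / uvw[2]      # perspective division

# A point 10 m ahead, 1 m right, 0.5 m up (camera frame: x right, y down, z forward).
u, v = project(np.array([1.0, -0.5, 10.0]))
print(round(u, 1), round(v, 1))  # → 720.0 320.0
```

Calibration is what recovers `K` (plus distortion terms) so that this mapping, and its inverse, agree with the real sensor.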
“…point cloud data from LiDARs) can also be used for object detection and tracking. A comparison of the three major sensors used for environment perception is provided in Table 1, compiled from multiple sources (Hasch et al., 2012; Murad et al., 2013; Patole et al., 2017; Campbell et al., 2018; Lin and Zhang, 2020; Lu et al., 2020; Wang et al., 2020; Zaarane et al., 2020; Yeong et al., 2021).…”
Section: Performance Metrics for Environment Perception
confidence: 99%
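A typical first step when using LiDAR point clouds for object detection, as mentioned in the excerpt, is separating obstacle points from the ground. The sketch below uses a synthetic cloud and a naive height threshold; real pipelines fit a ground plane (e.g. with RANSAC) rather than assuming a flat road, and all numbers here are invented.

```python
import numpy as np

# Synthetic point cloud: N x 3 array of (x, y, z) in metres, standing in for LiDAR data.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(0, 50, 500),
                          rng.uniform(-10, 10, 500),
                          rng.uniform(-0.05, 0.05, 500)])   # flat road near z = 0
obstacle = np.column_stack([rng.uniform(19, 21, 100),
                            rng.uniform(-1, 1, 100),
                            rng.uniform(0.3, 1.8, 100)])    # car-sized cluster 20 m ahead
cloud = np.vstack([ground, obstacle])

# Crude ground removal: keep only points above a height threshold.
above = cloud[cloud[:, 2] > 0.2]
print(len(above))  # → 100 (only the obstacle points survive)
```

The surviving points would then be clustered and tracked, which is where the detection and tracking metrics compared in the surrounding section apply.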
“…Moreover, sharing raw sensor data can contribute to resolving liability issues in the case of accidents and to improving the accuracy of object localization [20]. The authors of [24] discuss sharing raw and processed sensor data from the viewpoint of sensor fusion. High-level fusion, which shares the results of the detection and tracking algorithms run on each sensor, can be realized with lower complexity and requires few communication resources.…”
Section: Related Work
confidence: 99%
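The high-level fusion the excerpt describes — combining object-level tracks rather than raw data — can be sketched as an inverse-variance weighted merge of two sensors' estimates of the same object. The track structure, positions, and variances below are invented for illustration; real systems also handle association, timestamps, and full covariance matrices.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Object-level result a sensor shares under high-level fusion."""
    x: float     # position estimate (m)
    y: float
    var: float   # estimate variance

def fuse(a: Track, b: Track) -> Track:
    """Inverse-variance weighted fusion of two tracks of the same object."""
    w_a, w_b = 1.0 / a.var, 1.0 / b.var
    var = 1.0 / (w_a + w_b)
    return Track(x=var * (w_a * a.x + w_b * b.x),
                 y=var * (w_a * a.y + w_b * b.y),
                 var=var)

# Camera and radar tracks of one vehicle; the radar is trusted more (lower variance).
camera = Track(x=20.4, y=1.1, var=0.5)
radar  = Track(x=20.0, y=1.0, var=0.25)
fused = fuse(camera, radar)
print(round(fused.x, 2), round(fused.var, 3))  # → 20.13 0.167
```

Sharing a few floats per track, instead of full images or point clouds, is why the excerpt notes that high-level fusion needs few communication resources.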