2019 Fifth International Conference on Image Information Processing (ICIIP) 2019
DOI: 10.1109/iciip47207.2019.8985782
Radar and Camera Sensor Fusion with ROS for Autonomous Driving

Cited by 8 publications (4 citation statements)
References 21 publications
“…A number of fusion methods based on radar and camera sensors have been proposed [24], [25], [26], [27], [28], [29]. In [24], to improve the accuracy of target detection, the authors propose a Camera Radar Fusion-Net (CRF-Net), which fuses camera and radar data and points to a new direction for sensor data fusion work.…”
Section: B. Related Work (citation type: mentioning; confidence: 99%)
“…Finally, the target data are fused based on Bayesian estimation. In [27], a model implemented in the Robot Operating System (ROS) environment is proposed to handle radar and camera sensor synchronization and fusion for accurate object detection. In [28], a spatio-temporal synchronization method for roadside radar and camera is proposed to fuse targets and match vehicle trajectories.…”
Section: B. Related Work (citation type: mentioning; confidence: 99%)
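The synchronization step attributed to [27] pairs camera frames and radar scans that were captured at nearly the same instant; in ROS this is typically done with an approximate-time message filter. As an illustrative, self-contained sketch of that nearest-timestamp pairing policy (no ROS dependency; the function name and the 50 ms tolerance are assumptions for the example, not values from the paper):

```python
from bisect import bisect_left

def sync_by_timestamp(camera_stamps, radar_stamps, max_dt=0.05):
    """Pair each camera timestamp with the nearest radar timestamp,
    keeping only pairs closer than max_dt seconds apart
    (an approximate-time synchronization policy).
    Both input lists must be sorted ascending."""
    pairs = []
    for t_cam in camera_stamps:
        # Locate the radar stamps bracketing t_cam.
        i = bisect_left(radar_stamps, t_cam)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(radar_stamps)]
        if not candidates:
            continue
        # Choose the closer of the two neighbors.
        j = min(candidates, key=lambda k: abs(radar_stamps[k] - t_cam))
        if abs(radar_stamps[j] - t_cam) <= max_dt:
            pairs.append((t_cam, radar_stamps[j]))
    return pairs

# Camera at ~30 Hz, radar at ~20 Hz: each frame is matched to its
# nearest radar scan within the tolerance.
cam = [0.000, 0.033, 0.066, 0.100]
rad = [0.000, 0.050, 0.100]
print(sync_by_timestamp(cam, rad))
```

In a real ROS node this pairing is provided by `message_filters.ApproximateTimeSynchronizer`, which invokes a single callback with one message from each subscribed topic once their stamps fall within the configured slop.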
“…Consequently, the camera and radar sensors complement each other, together providing the most comprehensive view of the environment to the autonomous vehicle. In [2] and [3], it is stated that sensor fusion is the best approach for automated vehicles, as it allows a deeper understanding of the surroundings by accounting for different variables. Cameras are indispensable to autonomous vehicles because they visualize the surroundings, facilitating many of the processes essential to self-driving.…”
Section: Extended Abstract (citation type: mentioning; confidence: 99%)
“…However, camera-only solutions can be computationally taxing, as high-definition cameras produce millions of pixels per frame and require advanced software and hardware capabilities to process them. In [3], the authors discuss how radars can complement cameras by addressing the requirement for localization. However, radars provide lower angular accuracy and capture less data than the more expensive lidar.…”
Section: Extended Abstract (citation type: mentioning; confidence: 99%)