SAE Technical Paper Series 2018
DOI: 10.4271/2018-01-1608
Camera-Radar Data Fusion for Target Detection via Kalman Filter and Bayesian Estimation

Cited by 13 publications (7 citation statements) · References 9 publications
“…A number of fusion methods based on radar and camera sensors have been proposed [24], [25], [26], [27], [28], [29]. In [24], to improve the accuracy of target detection, the authors proposed a Camera Radar Fusion-Net (CRF-Net) that fuses camera and radar data, pointing out a new direction for sensor data fusion.…”
Section: B. Related Work
confidence: 99%
“…The camera calibration algorithm is used to combine radar targets with video vehicle-detection targets and to measure the speeds of multiple vehicles simultaneously. In [26], the main idea of sensor fusion is to extract image patches from regions of interest (ROIs) generated by projecting radar points into camera coordinates. Finally, the target data are fused by Bayesian estimation.…”
Section: B. Related Work
confidence: 99%
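This citing passage summarizes the reviewed paper's pipeline as radar-guided ROI extraction in the image followed by Bayesian fusion of the detections. The sketch below illustrates that idea only under assumed inputs: the projection matrices, ROI size, and detection confidences are illustrative placeholders, not values from the paper, and the Bayesian step is a naive independent-sensor update rather than the authors' exact formulation.

```python
import numpy as np

# Assumed camera intrinsics K and radar-to-camera extrinsics [R|t];
# real values would come from calibration.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R_t = np.hstack([np.eye(3), np.array([[0.0], [0.2], [0.0]])])

def radar_point_to_roi(point_radar, roi_size=(200, 120)):
    """Project a 3-D radar point into the image and build an ROI around it."""
    p_h = np.append(point_radar, 1.0)        # homogeneous radar point
    uvw = K @ (R_t @ p_h)                    # project into the image plane
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    w, h = roi_size
    return (int(u - w / 2), int(v - h / 2), w, h)   # (x, y, width, height)

def bayesian_fuse(p_camera, p_radar, prior=0.5):
    """Fuse two independent detection confidences for the same ROI
    with a naive Bayes update on the prior target probability."""
    num = prior * p_camera * p_radar
    den = num + (1.0 - prior) * (1.0 - p_camera) * (1.0 - p_radar)
    return num / den

# Example: a radar return 30 m ahead, slightly left of the sensor.
roi = radar_point_to_roi(np.array([-1.5, 0.0, 30.0]))
p_target = bayesian_fuse(p_camera=0.8, p_radar=0.7)
print(roi, round(p_target, 3))
```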
“…Secondly, for the track data from each sensor, gating decides whether the new incoming track data is admissible for the fused track state, using the Mahalanobis distance with a threshold derived from the χ²-distribution [25]. Finally, the track update process uses the well-established Kalman filter [26]. In the track update process, a track can be in one of three states: creation, standby, and confirmed.…”
Section: Vehicle Tracker
confidence: 99%
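The passage above describes χ²-gated association followed by a Kalman filter measurement update. A minimal sketch of those two steps follows, assuming a linear measurement model and a 99% gate probability; it is a textbook illustration, not the cited tracker's implementation.

```python
import numpy as np
from scipy.stats import chi2

# Chi-square gate for a 2-D measurement at 99% probability
# (the gate probability is an assumed tuning parameter).
GATE = chi2.ppf(0.99, df=2)

def mahalanobis_gate(z, z_pred, S):
    """Admit a measurement if its squared Mahalanobis distance to the
    predicted track measurement lies inside the chi-square gate."""
    innov = z - z_pred
    d2 = innov.T @ np.linalg.inv(S) @ innov
    return d2 <= GATE

def kalman_update(x, P, z, H, R_meas):
    """Standard Kalman filter measurement update."""
    S = H @ P @ H.T + R_meas              # innovation covariance
    K_gain = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K_gain @ (z - H @ x)
    P_new = (np.eye(len(x)) - K_gain @ H) @ P
    return x_new, P_new

# Track life-cycle states named in the citing paper.
TRACK_STATES = ("creation", "standby", "confirmed")
```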
“…The complementary nature of these sensors has inspired previous approaches [5]. Wang et al. [6] utilized radar detections to guide object search in images, while Yu et al. [7] presented a high-level fusion of camera and radar sensors, leveraging deep learning for object detection and Kalman filtering for tracking. Although recent research tends to favor eliminating radar sensors in favor of relying solely on cameras for advanced and autonomous driving functions, as noted in the review [8], camera sensors still have significant limitations, particularly in distance and velocity estimation.…”
Section: Introduction
confidence: 99%