2017
DOI: 10.1016/j.trpro.2017.12.032
V2V-Communication, LiDAR System and Positioning Sensors for Future Fusion Algorithms in Connected Vehicles

Cited by 31 publications (21 citation statements)
References 11 publications
“…To evaluate the performance of the proposed vehicle detection method, we compared the vehicle region selected by the ground-truth method and the vehicle region detected by the proposed method using Equation (3). The performance of the proposed vehicle detector was evaluated using precision and recall.…”
Section: Results of Vehicle Detection
Confidence: 99%
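The evaluation quoted above compares the ground-truth vehicle region with the detected region and reports precision and recall. The cited paper's Equation (3) is not reproduced here; the following is a minimal sketch assuming axis-aligned bounding boxes matched greedily by an IoU threshold. The names `iou` and `precision_recall` and the 0.5 threshold are illustrative assumptions, not the authors' method.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall(ground_truth, detections, iou_threshold=0.5):
    """Greedy matching: each detection may claim at most one ground-truth box."""
    matched_gt = set()
    true_positives = 0
    for det in detections:
        best_iou, best_idx = 0.0, None
        for idx, gt in enumerate(ground_truth):
            if idx in matched_gt:
                continue
            overlap = iou(det, gt)
            if overlap > best_iou:
                best_iou, best_idx = overlap, idx
        if best_iou >= iou_threshold:
            true_positives += 1
            matched_gt.add(best_idx)
    precision = true_positives / len(detections) if detections else 0.0
    recall = true_positives / len(ground_truth) if ground_truth else 0.0
    return precision, recall
```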
“…Automakers are developing self-driving vehicles and are selling vehicles with safe driving devices to support the driver. For safe autonomous driving, a fusion of various technologies is needed along with vehicle IT technology [1][2][3][4]. For vehicle IT technology, the sensors include laser, radar, ultrasonic wave, lidar, charge coupled device (CCD) sensors, and additional devices for communication among vehicles.…”
Section: Introduction
Confidence: 99%
“…Thirdly, according to the imaging similarity principle, the coordinates of the feature points in pixel coordinate system can be obtained using the constraint that the collinear relationship and the distance division ratio remain unchanged among three points before and after imaging. Finally, the coordinates of all feature points in LiDAR coordinate system and camera pixel coordinate system are brought into Equation (5) to solve the transformation matrix Q and the intrinsic and external parameters. Through the above three steps, the corresponding coordinates of the selected feature points in LiDAR coordinate system and camera pixel coordinate system can be obtained.…”
Section: Calculation of the Coordinates of Feature Points in Camera Pixel Coordinate System
Confidence: 99%
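The quoted procedure collects corresponding feature-point coordinates in the LiDAR frame and the camera pixel frame and substitutes them into the paper's Equation (5) to solve for the transformation matrix Q and the intrinsic and extrinsic parameters. Neither Equation (5) nor Q is reproduced here; the sketch below assumes the common formulation in which intrinsics and extrinsics combine into a single 3x4 projection matrix estimated by a direct linear transform (DLT) over six or more non-degenerate correspondences. The NumPy-based helper names are hypothetical.

```python
import numpy as np

def estimate_projection_matrix(lidar_points, pixel_points):
    """DLT sketch: find a 3x4 matrix P with s*[u, v, 1]^T = P*[X, Y, Z, 1]^T.

    lidar_points: (N, 3) feature points in the LiDAR coordinate system.
    pixel_points: (N, 2) corresponding image points (u, v).
    Requires N >= 6 non-degenerate correspondences.
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(lidar_points, pixel_points):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows)
    # Least-squares null vector: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)  # overall scale is arbitrary

def project(P, lidar_point):
    """Map one LiDAR point into pixel coordinates with the estimated matrix."""
    X = np.append(lidar_point, 1.0)
    u, v, w = P @ X
    return u / w, v / w
```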
“…Heterogeneous multi-sensor data fusion has extensive research and applications in mobile robots [1], driverless cars [2], and other fields. Compared with a single-sensor system, multi-sensor fusion systems [3] can provide richer environmental information and complete higher-level tasks such as target detection [4], autonomous location [5], and path planning [6]. The combined application of 3D light detection and ranging (LiDAR) and camera is very common.…”
Section: Introduction
Confidence: 99%