Summary
Autonomous driving has moved steadily toward practical deployment in recent years, making reliable real‐time environmental information critical for autonomous driving systems. Vehicle video surveillance systems based on multi‐source video and object detection algorithms can provide such information effectively. However, previous vehicle video surveillance systems have often been unable to balance surveillance quality against frame rate. This article therefore introduces a vehicle video surveillance system based on parallel computing and computer vision. First, multiple fisheye cameras collect surround‐view environmental information. Second, a low‐light camera, an infrared thermal imager, and a millimeter‐wave radar provide forward‐view environmental information at night. Correspondingly, we design a surround‐view image fusion algorithm and a forward‐view image fusion algorithm based on parallel computing. Finally, a monocular camera and detection algorithms provide forward‐view detection results. In summary, this vehicle video surveillance system will benefit the practical application of autonomous driving.