2020
DOI: 10.1109/msp.2020.2985815
Event-Based Neuromorphic Vision for Autonomous Driving: A Paradigm Shift for Bio-Inspired Visual Sensing and Perception

Abstract: As a bio-inspired and emerging sensor, the event-based neuromorphic vision sensor has a different working principle from standard frame-based cameras, which leads to promising properties of low energy consumption, low latency, high dynamic range (HDR), and high temporal resolution. It poses a paradigm shift in sensing and perceiving the environment by capturing local pixel-level light intensity changes and producing asynchronous event streams. Advanced technologies for the visual sensing system of autonom…
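To make the sensing principle described in the abstract concrete, the following minimal sketch (Python with NumPy) emulates how an event camera fires an asynchronous event whenever a pixel's log-intensity change exceeds a contrast threshold. The frame-based input, the threshold value, and the function name frames_to_events are illustrative assumptions, not part of the paper.

```python
# Minimal sketch of the event-generation principle: each pixel emits an
# asynchronous event (t, x, y, polarity) whenever its log-intensity changes
# by more than a contrast threshold. The threshold and frame-based input are
# illustrative assumptions, not the paper's method.
import numpy as np

CONTRAST_THRESHOLD = 0.15  # illustrative contrast sensitivity (log-intensity units)

def frames_to_events(frames, timestamps, threshold=CONTRAST_THRESHOLD):
    """Convert grayscale frames (H x W, float in [0, 1]) into a list of
    events (t, x, y, polarity), emulating a DVS pixel array."""
    log_ref = np.log(frames[0] + 1e-6)          # per-pixel reference log intensity
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_now = np.log(frame + 1e-6)
        diff = log_now - log_ref
        # Pixels whose brightness change exceeds the threshold fire an event.
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, x, y, polarity))
            log_ref[y, x] = log_now[y, x]        # reset reference at fired pixels
    return events
```

Each returned tuple corresponds to one asynchronous event; firing only at changed pixels is what gives the sensor its low latency, sparsity, and high temporal resolution compared to full-frame readout.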

Citations: Cited by 210 publications (112 citation statements)
References: 47 publications
“…However, current research finds that deep learning is not only poor at handling the mathematical problems in SLAM, but also suffers from insufficient calibration data sets, so the detection accuracy for moving objects cannot reach excellent performance. Therefore, it is impossible at this stage to completely replace the traditional SLAM target detection module with deep learning and a neuromorphic vision sensor [39].…”
Section: It Can Be Learned From Human's Common Sense And…
Citation type: mentioning (confidence: 99%)
“…Dynamic control is one of the most crucial tasks for an autonomous driving vehicle (Chen et al, 2020). H-infinity output feedback control (Hu et al, 2016), sliding mode control (Jiang and Wu, 2018), model predictive control (MPC) (Sun et al, 2019), etc.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
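Since the quoted statement names model predictive control among the candidate controllers, here is a minimal sketch of one receding-horizon MPC step for longitudinal vehicle control. The double-integrator model, horizon length, weights, and the use of cvxpy are illustrative assumptions, not taken from the cited works.

```python
# Minimal sketch of model predictive control (MPC) for longitudinal vehicle
# control. Model, horizon, and weights are illustrative assumptions.
import numpy as np
import cvxpy as cp

dt, N = 0.1, 20                        # step size [s] and prediction horizon
A = np.array([[1.0, dt], [0.0, 1.0]])  # state: [position, velocity]
B = np.array([[0.0], [dt]])            # input: acceleration command

def mpc_step(x0, x_ref, u_max=3.0):
    """Solve one finite-horizon MPC problem and return the first control input."""
    x = cp.Variable((2, N + 1))
    u = cp.Variable((1, N))
    cost, constraints = 0, [x[:, 0] == x0]
    for k in range(N):
        cost += cp.quad_form(x[:, k] - x_ref, np.diag([1.0, 0.1]))  # tracking error
        cost += 0.01 * cp.square(u[0, k])                           # control effort
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],   # vehicle dynamics
                        cp.abs(u[0, k]) <= u_max]                   # actuator limit
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return u.value[0, 0]  # apply only the first input, then re-solve (receding horizon)
```

Here x0 and x_ref are NumPy arrays of shape (2,) giving the current and desired [position, velocity]; only the first optimized input is applied before the problem is re-solved at the next step.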
“…The researchers of Chen et al (2018) applied a neuromorphic vision sensor in intelligent transportation systems. Some works have been reviewed in Chen et al (2020) and Gallego et al (2019). Considering the high demands on computation time and storage, the traditional RGB-D camera cannot satisfy real-time requirements.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)