2022
DOI: 10.1109/tpami.2020.3008413

Event-Based Vision: A Survey

Abstract: Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes, and output a stream of events that encode the time, location and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (in the order of µs), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of …
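The event data model described in the abstract (each event encodes a timestamp, a pixel location, and the sign of a brightness change) can be made concrete with a short sketch. The structure, field names, and synthetic stream below are illustrative assumptions, not the interface of any particular camera or driver.

```python
# Minimal sketch of an event stream: each event encodes time, location, and
# the sign (polarity) of the brightness change. All names are illustrative.
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    t: float  # timestamp in seconds (event cameras resolve microseconds)
    x: int    # pixel column
    y: int    # pixel row
    p: int    # polarity: +1 for a brightness increase, -1 for a decrease

def accumulate(events, height, width):
    """Sum signed polarities per pixel over a time window into a 2D image."""
    img = np.zeros((height, width), dtype=np.int32)
    for e in events:
        img[e.y, e.x] += e.p
    return img

# Tiny synthetic stream: two ON events at (x=3, y=2), one OFF event at (x=7, y=4).
stream = [Event(1e-6, 3, 2, +1), Event(2e-6, 3, 2, +1), Event(5e-6, 7, 4, -1)]
frame = accumulate(stream, height=8, width=8)
print(frame[2, 3], frame[4, 7])  # 2 -1
```

Accumulating signed events over a window is only the simplest way to turn the asynchronous stream back into a frame-like representation; the survey discusses many richer event representations and processing paradigms.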

Cited by 1,294 publications (898 citation statements). References 217 publications.
“…These kinds of sensors can operate under different light intensities, and offer low latency and low power consumption. An extensive review of the algorithms, applications, and datasets is provided in [27]. Moreover, using heterogeneous platforms facilitates dividing complex tasks in harsh environments and enhances flexibility and mobility.…”
Section: Discussion and Future Research Directions
confidence: 99%
“…Rather, our sensor and learning method are both event-driven. (Available at https://clear-nus.github.io/visuotactile/.) B. Event-based Perception: Sensors and Learning: Work on event-based perception has focused primarily on vision (see [29] for a comprehensive survey). This emphasis on vision can be attributed both to its applicability across many tasks and to the recent availability of event cameras such as the DVS and Prophesee Onboard.…”
Section: A Visual-tactile Perception For Robots
confidence: 99%
“…Event-based sensors have been successfully used in conjunction with deep learning techniques [29]. The binary events are first converted into real-valued tensors, which are processed downstream by deep ANNs.…”
Section: A Visual-tactile Perception For Robots
confidence: 99%
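The conversion step described in the statement above (binary events turned into real-valued tensors before being processed by a deep ANN) can be sketched as a simple voxel-grid binning. The bin count and the linear temporal weighting here are assumptions chosen for illustration, not the specific representation used by any particular cited work.

```python
# Hedged sketch: bin an event stream into a real-valued (num_bins, H, W) tensor
# that a standard convolutional network can consume. The voxel-grid layout and
# linear temporal weighting are illustrative assumptions.
import numpy as np

def events_to_voxel_grid(t, x, y, p, num_bins, height, width):
    """Spread each event's polarity over its two nearest temporal bins."""
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(t) == 0:
        return grid
    # Normalize timestamps to the range [0, num_bins - 1].
    t_norm = (t - t[0]) / max(t[-1] - t[0], 1e-9) * (num_bins - 1)
    left = np.floor(t_norm).astype(int)
    frac = t_norm - left
    right = np.clip(left + 1, 0, num_bins - 1)
    # Linear interpolation in time: split each polarity between adjacent bins.
    np.add.at(grid, (left, y, x), p * (1.0 - frac))
    np.add.at(grid, (right, y, x), p * frac)
    return grid

# Example: three events on a 4x4 sensor, binned into 2 temporal slices.
t = np.array([0.0, 0.5, 1.0])
x = np.array([1, 2, 3])
y = np.array([0, 1, 2])
p = np.array([1.0, -1.0, 1.0])
voxels = events_to_voxel_grid(t, x, y, p, num_bins=2, height=4, width=4)
print(voxels.shape)  # (2, 4, 4), ready to feed to a CNN
```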
“…Due to its unique way of acquiring information from the environment, a paradigm shift is necessary to construct algorithms that accommodate such information. Event-based SLAM is beyond the scope of this review paper, and interested readers are referred to the comprehensive survey in [41].…”
Section: Introduction
confidence: 99%