2022
DOI: 10.1109/access.2022.3177744
Fast Classification and Action Recognition With Event-Based Imaging

Abstract: Neuromorphic event cameras, which capture the optical changes of a scene, have drawn increasing attention due to their high speed and low power consumption. However, event data are noisy, sparse, and nonuniform in the spatial-temporal domain, with extremely high temporal resolution, making them challenging to process with traditional deep learning algorithms. To enable convolutional neural network models for event vision tasks, most methods encode events into point-cloud or voxel representations, but their…

Cited by 12 publications (2 citation statements)
References 45 publications
“…Tracking feature motion with event-based sensors via optical flow has become even easier, as newer sensors provide optical flow as a direct event output (e.g., CeleX-V) or through processing tool chains. Event processing for optical flow, and hence target identification and tracking, has been described in the literature [6,7]. It is not hard to imagine simple optical flow algorithms [9] being adapted to event streams from adjacent pixels.…”
Section: Turbulence Affected Imagery
confidence: 99%
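The adaptation hinted at in the statement above — applying a classic gradient-based flow estimator to event data — can be sketched by first accumulating events into 2D frames and then solving a Lucas-Kanade least-squares system over a local window. This is an illustrative sketch, not the cited paper's method; the `(x, y, t, polarity)` tuple format and the function names are assumptions.

```python
import numpy as np

def events_to_frame(events, shape):
    """Accumulate (x, y, t, polarity) events into a signed 2D frame.

    Each event adds its polarity (+1 / -1) at its pixel location,
    turning a sparse event stream into a dense frame-like image.
    """
    frame = np.zeros(shape, dtype=np.float64)
    for x, y, _, p in events:
        frame[y, x] += p
    return frame

def lucas_kanade_flow(prev, curr, y, x, win=3):
    """Estimate flow (u, v) at one pixel from two accumulated frames.

    Solves the brightness-constancy constraint Ix*u + Iy*v = -It
    by least squares over a (2*win+1)^2 neighbourhood.
    """
    Iy, Ix = np.gradient(prev)          # spatial gradients (rows = y, cols = x)
    It = curr - prev                    # temporal gradient between the two frames
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow                         # (u, v) in pixels per frame interval
```

In an event-camera setting, `prev` and `curr` would be built from consecutive time slices of the event stream via `events_to_frame`, so the "frame rate" is a free parameter set by the slice duration rather than by the sensor.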
“…The event-based sensor suffers from the same light and device noise issues as traditional sensors, but readout noise is a different phenomenon, and fill factor has been an issue because each pixel carries more integrated circuitry. Many researchers have identified ways to characterise sensor performance in low-light conditions [7,8]. The fill-factor limitation stems from the level of integration required at each pixel site, and has recently been addressed by back-illuminated devices.…”
Section: Introduction
confidence: 99%