2017
DOI: 10.3389/fnins.2016.00594

A Motion-Based Feature for Event-Based Pattern Recognition

Abstract: This paper introduces an event-based, luminance-free feature computed from the output of asynchronous event-based neuromorphic retinas. The feature consists of mapping the distribution of the optical flow along the contours of moving objects in the visual scene into a matrix. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating "spiking" events that encode relative changes in the pixel's illumination at high temporal resolution. The optical flow is comp…
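The abstract describes mapping the distribution of optical flow along the contours of moving objects into a matrix. A minimal sketch of one such speed-by-direction histogram is shown below, assuming per-event flow vectors (vx, vy) have already been estimated; the function name, bin counts, and normalization here are illustrative and not the paper's exact construction:

```python
import numpy as np

def motion_feature(flow_vectors, n_speed_bins=4, n_angle_bins=8, max_speed=2.0):
    """Hypothetical sketch: bin per-event optical-flow vectors (vx, vy)
    into a speed-by-direction histogram matrix, then normalize it."""
    feat = np.zeros((n_speed_bins, n_angle_bins))
    for vx, vy in flow_vectors:
        speed = np.hypot(vx, vy)                      # flow magnitude
        angle = np.arctan2(vy, vx) % (2 * np.pi)      # direction in [0, 2*pi)
        s = min(int(speed / max_speed * n_speed_bins), n_speed_bins - 1)
        a = min(int(angle / (2 * np.pi) * n_angle_bins), n_angle_bins - 1)
        feat[s, a] += 1
    total = feat.sum()
    return feat / total if total > 0 else feat
```

Normalizing by the event count makes the matrix insensitive to how many events a contour generates, which is one plausible way to obtain a luminance-free descriptor.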


Cited by 53 publications (43 citation statements)
References 87 publications (126 reference statements)
“…Recently, [12] introduced a feature descriptor based on local distributions of optical flow and applied it to corner detection and gesture recognition. It is inspired by its framebased counterpart [10], but in [12] the algorithm for computing the optical flow relies on the temporal information carried by the events. One limitation of [12] is that the quality of the descriptor strongly depends on the quality of the flow.…”
Section: Event-based Features and Object Classification
confidence: 99%
“…It is inspired by its framebased counterpart [10], but in [12] the algorithm for computing the optical flow relies on the temporal information carried by the events. One limitation of [12] is that the quality of the descriptor strongly depends on the quality of the flow. As a consequence, it loses accuracy in presence of noise or poorly contrasted edges.…”
Section: Event-based Features and Object Classification
confidence: 99%
“…More complex data association schemes have been proposed, but mainly aimed to high-level tasks, such as pattern recognition [8,9,19,5], where prior training to a model is required. Therefore, these methods are not suitable for on-line corner-event tracking, where no a priori knowledge of the scene can be assumed.…”
Section: Previous Work
confidence: 99%
“…Feature detection and tracking with event cameras is a major research topic [8,9,12–18], where the goal is to unlock the capabilities of event cameras and use … [Fig. 1(a) caption: Comparison of the output of a standard frame-based camera and an event camera when facing a black dot on a rotating disk; figure adapted from [11].]…”
Section: Related Work
confidence: 99%
“…Recently, extensions of popular image-based keypoint detectors, such as Harris [19] and FAST [20], have been developed for event cameras [17,18]. Detectors based on the distribution of optical flow [21] for recognition applications have also been proposed for event cameras [16]. Finally, most event-based trackers use binary feature templates, either predefined [13] or built from a set of events [9], to which they align events by means of iterative point-set-based methods, such as iterative closest point (ICP) [22].…”
Section: (B)
confidence: 99%
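The quoted statement above mentions event-based trackers that align events to binary feature templates with iterative closest point (ICP). A minimal rigid 2-D ICP sketch is given below for orientation only; it assumes small initial misalignment and uses brute-force nearest neighbours, and is not the implementation of any cited tracker:

```python
import numpy as np

def icp_2d(events, template, n_iters=10):
    """Hypothetical sketch: rigidly align 2-D event coordinates to a
    template point set by alternating nearest-neighbour matching and a
    closed-form (Kabsch/SVD) rigid fit. Returns rotation R and translation t
    such that R @ p + t maps an event point p onto the template."""
    src = np.asarray(events, dtype=float)
    dst = np.asarray(template, dtype=float)
    R, t = np.eye(2), np.zeros(2)
    for _ in range(n_iters):
        cur = src @ R.T + t
        # brute-force nearest-neighbour correspondences
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # closed-form rigid fit between matched centred point sets
        mu_s, mu_m = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
        R_step = Vt.T @ D @ U.T
        # compose the incremental transform with the accumulated one
        R = R_step @ R
        t = R_step @ t + (mu_m - R_step @ mu_s)
    return R, t
```

With noisy event streams, real trackers typically add outlier rejection and weighting on top of this basic loop, which is where the descriptor-quality concerns raised in the citation statements come in.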