This demonstration shows how an inexpensive, high frame-rate USB camera can emulate existing and proposed activity-driven, event-based vision sensors. A PS3-Eye camera, which runs at up to 125 frames/second with colour QVGA (320×240) resolution, is used to emulate several event-based vision sensors: a Dynamic Vision Sensor (DVS), a colour-change-sensitive DVS (cDVS), and a hybrid vision sensor with DVS+cDVS pixels. The emulator is integrated into the jAER software project for event-based real-time vision and is used to study use cases for future vision sensor designs.
I. DEMONSTRATION
Recent developments in activity-driven, event-based vision sensors have opened up a promising alternative to conventional frame-based vision. By outputting a sparse and variable-rate stream of data originating asynchronously from pixels with local gain control, these sensors reduce processing cost and latency, while increasing dynamic range. Except for [1], these vision sensor developments have been preceded by silicon chip developments. We thought that it would be useful to develop a behavioural emulator of such chip architectures that would allow us or others to study how to use a proposed vision sensor in application scenarios, and to allow us to model, in software, non-idealities such as fixed-pattern noise and temporal noise. In particular, we needed an emulator that is fully integrated into our host-side software infrastructure [3]. In this demonstration, we show how this emulation allows us to model several existing and proposed vision sensor architectures.
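To illustrate the frame-to-event conversion principle underlying such an emulator (a minimal sketch only, not the jAER implementation), the following Java class emits ON and OFF address-events whenever a pixel's log intensity changes by more than a threshold relative to a per-pixel memorized value; the class name, threshold value, and frame format are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of DVS emulation from successive camera frames (illustrative only). */
public class DvsFrameEmulator {
    /** One emulated address-event: pixel location, polarity, and timestamp. */
    public static class Event {
        public final int x, y;
        public final boolean on;       // true = brightness increase, false = decrease
        public final long timestampUs;
        Event(int x, int y, boolean on, long timestampUs) {
            this.x = x; this.y = y; this.on = on; this.timestampUs = timestampUs;
        }
    }

    private final int width, height;
    private final double threshold;        // log-intensity change per event, e.g. ~0.15
    private final double[] memorizedLogI;  // per-pixel log intensity at last emitted event

    public DvsFrameEmulator(int width, int height, double threshold) {
        this.width = width;
        this.height = height;
        this.threshold = threshold;
        this.memorizedLogI = new double[width * height];
        java.util.Arrays.fill(memorizedLogI, Double.NaN);
    }

    /** Converts one grey-level frame (0..255 per pixel, row-major) into ON/OFF events. */
    public List<Event> processFrame(int[] frame, long frameTimestampUs) {
        List<Event> events = new ArrayList<>();
        for (int i = 0; i < frame.length; i++) {
            double logI = Math.log(frame[i] + 1.0);   // +1 avoids log(0)
            if (Double.isNaN(memorizedLogI[i])) {     // initialize on the first frame
                memorizedLogI[i] = logI;
                continue;
            }
            double diff = logI - memorizedLogI[i];
            // Emit one event per threshold crossing and update the memorized value each
            // time, so a large brightness step produces several events, as in a real DVS.
            while (Math.abs(diff) >= threshold) {
                boolean on = diff > 0;
                events.add(new Event(i % width, i / width, on, frameTimestampUs));
                memorizedLogI[i] += on ? threshold : -threshold;
                diff = logI - memorizedLogI[i];
            }
        }
        return events;
    }
}
```

All events generated from one frame share that frame's timestamp, which is what limits the emulator's temporal resolution to the camera's frame interval.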
II. DEMONSTRATION SETUP
The demonstration setup consists of a laptop, a webcam (PlayStation PS3-Eye), and a DVS128 Dynamic Vision Sensor [4]. A large monitor or projector shows the output.
III. VISITOR EXPERIENCE
Visitors see how the emulator models an existing DVS128, how the improved resolution of the QVGA emulation will benefit applications in wide-area surveillance, and how the emulation models the cDVS sensor we are developing to simultaneously detect colour change and brightness change [5]. By using a high-speed rotating stimulus, visitors also see the limitations of the conventional camera in terms of time resolution (8 ms for the PS3-Eye compared with 1 µs for the DVS128) and in terms of dynamic range (<50 dB for the PS3-Eye, compared with 120 dB for the DVS128).
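For illustration only, the colour-change pathway of the cDVS could be emulated per pixel by thresholding changes in a wavelength proxy derived from the RGB frame. The red-to-blue ratio used in the sketch below is an assumption chosen for simplicity, not the published cDVS pixel model [5]; class and parameter names are likewise hypothetical.

```java
/** Sketch of a colour-change (cDVS-like) emulation pathway; illustrative assumptions only. */
public class ColourChangeEmulator {
    private final double threshold;        // log wavelength-proxy change per event
    private final double[] memorizedLogW;  // per-pixel memorized log proxy value

    public ColourChangeEmulator(int numPixels, double threshold) {
        this.threshold = threshold;
        this.memorizedLogW = new double[numPixels];
        java.util.Arrays.fill(memorizedLogW, Double.NaN);
    }

    /** Returns +1 (shift towards red), -1 (shift towards blue), or 0 (no event) for one pixel. */
    public int update(int pixelIndex, int r, int g, int b) {
        // Proxy for mean wavelength: log of the red-to-blue ratio (hedged assumption).
        double logW = Math.log((r + 1.0) / (b + 1.0));
        if (Double.isNaN(memorizedLogW[pixelIndex])) {  // initialize on the first frame
            memorizedLogW[pixelIndex] = logW;
            return 0;
        }
        double diff = logW - memorizedLogW[pixelIndex];
        if (Math.abs(diff) < threshold) return 0;
        memorizedLogW[pixelIndex] += (diff > 0) ? threshold : -threshold;
        return diff > 0 ? +1 : -1;
    }
}
```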
IV. DEMONSTRATION READINESS
The demonstration is fully functional (see Fig. 1). We can simultaneously run the emulation and a real event-based sensor, record data, and play it back. We can display data in a variety of formats, such as the spatial histogram shown in Fig. 1.

Fig. 1. Vision sensor emulation demonstration. The output of jAER [3] simultaneously shows the emulated DVS+cDVS and the raw PS3-Eye camera. The left window shows the jAER output; the emulated pixels respond both to brightness change (black and white pixels) and mean-wavelength change (red and blue pixels). The right window shows the control panel, which allows camera and emulation parameter control, and also shows the raw PS3-Eye camera.