2013
DOI: 10.1007/978-3-642-39402-7_14

Simultaneous Localization and Mapping for Event-Based Vision Systems

Abstract: We propose a novel method for vision-based simultaneous localization and mapping (vSLAM) using a biologically inspired vision sensor that mimics the human retina. The sensor consists of a 128×128 array of asynchronously operating pixels, which independently emit events upon a temporal illumination change. Such a representation generates small amounts of data with high temporal precision; however, most classic computer vision algorithms need to be reworked, as they require full RGB(-D) images at fixed …
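The event representation the abstract describes is concrete enough to sketch: each DVS pixel fires an asynchronous event carrying its pixel address, a timestamp, and the sign of the brightness change (the standard address-event representation used by such sensors). A minimal illustration, with field names chosen for clarity rather than taken from the paper:

```python
from collections import namedtuple

# Address-event representation (AER): each of the 128x128 pixels emits
# an event only when its log-intensity changes beyond a threshold, so
# the sensor output is a sparse, high-rate stream of tuples rather
# than full frames at a fixed rate.
Event = namedtuple("Event", ["x", "y", "t", "polarity"])

# e.g. pixel (64, 32) reporting a brightness increase at t = 0.001234 s:
e = Event(x=64, y=32, t=0.001234, polarity=+1)
```

This sparsity is why the abstract notes that classic frame-based algorithms must be reworked: there is never a full image to operate on, only individual pixel events.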

Cited by 101 publications (90 citation statements)
References 9 publications
“…Localization using a DVS on a ground robot was first presented in [14] and later extended to Simultaneous Localization And Mapping (SLAM) in [15]. However, the system was limited to planar motion and a 2D map.…”
Section: B. Related Work
confidence: 99%
“…We used a probabilistic framework that updates the pose likelihood relative to the previous CMOS frame by processing each event individually as soon as it arrives. As in [15], the experiments were performed at relatively low speeds (up to 30…”
Section: B. Related Work
confidence: 99%
“…A particle-filter approach for robot self-localization using the DVS was introduced in [28] and later extended to SLAM in [29]. However, the system was limited to planar motions and 2-D maps.…”
Section: Related Work: Ego-motion Estimation With Event-based VI
confidence: 99%
“…Event-based adaptations of iterative closest points [24] and optical flow [5] have already been proposed. Recently, event-based visual odometry [9,17], tracking [28,23], and Simultaneous Localization And Mapping (SLAM) [29] algorithms have also been presented. The design goal of such algorithms is that each incoming event can asynchronously change the estimated state of the system, thus preserving the event-based nature of the sensor and allowing the design of highly reactive systems, such as pen balancing [10] or particle tracking in fluids [12].…”
Section: Introduction
confidence: 99%
“…Additionally, the resampling operation is computationally expensive when executed several thousand times per second. For these reasons an incremental model is chosen, in which the Markov assumption is relaxed and the score of a particle depends not only on the current measurement but also on the recent past of measurements [12], [13]. For each new event e_k, particle scores are updated using an exponential decay model:…”
Section: Event-Based 3D SLAM
confidence: 99%
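The exponential decay model quoted above is cut off at the colon, but the mechanism it describes can be sketched. The following is a minimal illustration of an incremental particle-score update with exponential forgetting, not the paper's actual formula; the names `tau`, `event_likelihood`, and the blending form are assumptions:

```python
import math

def update_score(prev_score, event_likelihood, dt, tau=0.1):
    """Blend a particle's previous score with the likelihood of a new event.

    prev_score       -- running score carrying the recent measurement history
    event_likelihood -- how well the new event matches this particle's pose
    dt               -- time elapsed since the previous event (seconds)
    tau              -- decay time constant; larger tau = longer memory
    """
    # Old evidence fades exponentially with elapsed time, so the score
    # depends on the recent past of measurements, not just the latest one
    # (the relaxed Markov assumption described in the quote).
    decay = math.exp(-dt / tau)
    return decay * prev_score + (1.0 - decay) * event_likelihood
```

Because each event blends a single likelihood term into a running score, the update is O(1) per event, which is what makes per-event processing at thousands of events per second tractable without resampling on every event.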