2014 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2014.6906882
Event-based 3D SLAM with a depth-augmented dynamic vision sensor

Abstract: We present the D-eDVS, a combined event-based 3D sensor, and a novel event-based full-3D simultaneous localization and mapping algorithm which works exclusively with the sparse stream of visual data provided by the D-eDVS. The D-eDVS is a combination of the established PrimeSense RGB-D sensor and a biologically inspired embedded dynamic vision sensor. Dynamic vision sensors only react to dynamic contrast changes and output data in the form of a sparse stream of events which represent individual pixel locat…
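As a rough illustration of the kind of data the abstract describes (a sparse event carrying a pixel location, augmented with depth from the RGB-D sensor), the minimal Python sketch below shows one possible representation and a pinhole back-projection to a 3D point. The field names, helper name back_project, and intrinsic values are assumptions for illustration, not the authors' code or calibration.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class DepthEvent:
        """Hypothetical depth-augmented DVS event: pixel location, timestamp,
        contrast-change polarity, and a depth value taken from the registered
        RGB-D depth map."""
        x: int          # pixel column
        y: int          # pixel row
        t: float        # timestamp in seconds (DVS events have microsecond resolution)
        polarity: int   # +1 or -1 contrast change
        depth: float    # depth in meters

    def back_project(ev: DepthEvent, fx=570.0, fy=570.0, cx=64.0, cy=64.0):
        """Back-project an event to a 3D point in the camera frame.
        Intrinsics are placeholder values, not calibrated parameters."""
        z = ev.depth
        X = (ev.x - cx) / fx * z
        Y = (ev.y - cy) / fy * z
        return np.array([X, Y, z])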

Cited by 118 publications (78 citation statements)
References 12 publications
“…However, the system design was limited to planar motions (i.e., 3-DOF) and planar scenes parallel to the plane of motion consisting of artificial B&W line patterns. The method was extended to 3D in [8] but relied on an external RGB-D sensor attached to the event camera for depth estimation. The depth sensor introduces bottlenecks, which deprives the system of the low latency and high-speed advantages of event cameras.…”
Section: Contribution
confidence: 99%
“…In the experiments, they used an upward-looking DVS mounted on a ground robot moving at low speed. The method was extended in [7] to a 3D SLAM system that requires an RGB-D sensor operating in parallel with the DVS.…”
Section: B. Event-based Motion Estimation
confidence: 99%
“…These sensors constitute a paradigm shift since they operate asynchronously, transmitting only the information conveyed by brightness changes in the scene ("events"), at the time they occur with microsecond resolution. Event-driven algorithms have been developed to provide initial solutions to some robotics problems such as pose tracking [3], [4], visual odometry [5], Simultaneous Localization and Mapping (SLAM) [6], [7]. However, some of these approaches used additional sensors, such as depth sensors [5], [7], or were developed for high-contrast scenes [3], [4], [5].…”
Section: Introduction
confidence: 99%
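The excerpt above describes the event-driven paradigm: the estimate is updated with every single brightness-change event as it arrives, rather than once per image frame. The sketch below is only a schematic illustration of that idea under my own naming (process_event_stream and update_state are hypothetical, not from any cited system).

    def process_event_stream(events, update_state, initial_state=None):
        """Event-driven processing: refine the state with each event as it
        arrives instead of waiting for a full frame."""
        state = initial_state
        for ev in events:                      # events arrive time-ordered, asynchronously
            state = update_state(state, ev)    # small incremental update per event
            yield ev.t, state                  # state available at event-time resolution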
“…As another example, in the context of SLAM, the DVS was combined with a frame-based RGB-D camera in [30]. The algorithm used a modified particle filter for tracking the current position and orientation of the sensor while at the same time incrementally creating a probabilistic voxel grid map of the previously unknown environment.…”
Section: Related Work: Ego-motion Estimation With Event-based VI
confidence: 99%
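The last excerpt summarizes the approach of [30]: a modified particle filter tracks the sensor pose while a probabilistic voxel grid map is built incrementally. The following Python sketch conveys only the general structure under stated assumptions; the class name, noise model, scoring function, and the absence of resampling are placeholders, not the published algorithm.

    import numpy as np

    class EventParticleFilter:
        """Rough sketch: each particle carries an SE(3) pose hypothesis; every
        event (as a 3D point in the camera frame) is scored against a
        probabilistic voxel grid, and the map is grown from the best-scoring
        pose. Details are illustrative placeholders."""

        def __init__(self, n_particles, voxel_size=0.01):
            self.poses = [np.eye(4) for _ in range(n_particles)]   # 4x4 pose matrices
            self.weights = np.ones(n_particles) / n_particles
            self.map = {}                       # voxel index -> occupancy score
            self.voxel_size = voxel_size

        def voxel_index(self, p):
            return tuple(np.floor(p / self.voxel_size).astype(int))

        def update(self, point_cam):
            """Per-event update with a back-projected 3D point in the camera frame."""
            for i, T in enumerate(self.poses):
                # diffuse the pose hypothesis (placeholder noise model)
                T[:3, 3] += np.random.normal(scale=1e-3, size=3)
                p_world = (T @ np.append(point_cam, 1.0))[:3]
                # weight by how well the event agrees with the current map
                self.weights[i] *= 1e-3 + self.map.get(self.voxel_index(p_world), 0.0)
            self.weights /= self.weights.sum()
            # grow the map from the best pose hypothesis
            best = self.poses[int(np.argmax(self.weights))]
            p_best = (best @ np.append(point_cam, 1.0))[:3]
            idx = self.voxel_index(p_best)
            self.map[idx] = self.map.get(idx, 0.0) + 1.0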