2018
DOI: 10.1109/lra.2018.2793357

Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios

Abstract: Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motion or in scenes with high dynamic range. However, event cameras provide little information when the amount of motion is limited, such as during near-stationary motion. Conversely, standard cameras…
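To make the abstract's description of the sensor concrete, here is a minimal sketch of the standard event-generation model: a pixel emits an event whenever its log-intensity changes by more than a contrast threshold since the pixel's last event. The function name, threshold value, and frame-based simulation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def events_from_frames(frames, timestamps, contrast_threshold=0.2):
    """Toy event-generation model: a pixel fires an event when its
    log-intensity changes by more than `contrast_threshold` since the
    last event at that pixel. Returns (t, x, y, polarity) tuples.
    All constants are illustrative."""
    eps = 1e-3  # avoid log(0) on black pixels
    ref = np.log(frames[0].astype(np.float64) + eps)  # per-pixel reference log-intensity
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - ref
        # Pixels whose log-intensity rose (ON, +1) or fell (OFF, -1) past the threshold
        for polarity, mask in ((+1, diff >= contrast_threshold),
                               (-1, diff <= -contrast_threshold)):
            ys, xs = np.nonzero(mask)
            events.extend((t, x, y, polarity) for x, y in zip(xs, ys))
            ref[mask] = log_i[mask]  # reset the reference where events fired
    return events
```

Because only changing pixels produce output, a static scene yields almost no events, which is exactly the limitation the abstract notes for near-stationary motion.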


Cited by 392 publications (275 citation statements)
References 31 publications (63 reference statements)
“…As a result, automatic exposure control is important. Improvements can be made by adapting exposure control to the expected brightness level of the observed landing target, or by using a different technology like an event‐based camera, which has been shown to be less susceptible to exposure effects (Vidal, Rebecq, Horstschaefer, & Scaramuzza, 2018).…”
Section: Results
confidence: 99%
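The excerpt above suggests adapting exposure control toward the expected brightness of the observed target. A minimal proportional auto-exposure step under that assumption might look as follows; the target value, gain, and exposure limits are hypothetical, not from the cited work.

```python
def update_exposure(exposure_us, frame, target_mean=110.0,
                    gain=0.8, min_us=50, max_us=20000):
    """One step of a proportional auto-exposure controller.

    Scales the exposure time toward the value that would bring the mean
    image brightness to `target_mean` (e.g. the expected brightness of a
    landing target). All constants are illustrative assumptions."""
    mean = float(frame.mean())
    if mean < 1.0:           # nearly black frame: open up aggressively
        return min(exposure_us * 2.0, max_us)
    # Away from saturation, brightness scales roughly linearly with
    # exposure time, so a damped multiplicative update converges smoothly.
    ratio = target_mean / mean
    new_exposure = exposure_us * (1.0 + gain * (ratio - 1.0))
    return max(min_us, min(new_exposure, max_us))
```

An event camera sidesteps this loop entirely, since it reports relative log-intensity changes rather than absolute exposure-dependent intensities.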
“…Figure 26 shows an example mission data product monitoring the surface temperature of a rock formation at different times of day. (Vidal, Rebecq, Horstschaefer, & Scaramuzza, 2018).…”
Section: Long-duration Autonomy and Recharging
confidence: 99%
“…The next leap in visual–IMU fusion came with the introduction of tight coupling of the frame‐based camera, the event‐based camera, and the IMU (Vidal et al., 2018) for pose estimation, leveraging the complementary advantages of frame‐based and event‐based cameras. This extended the earlier event-plus-IMU work of Rebecq et al. to include standard frames in the fusion as well.…”
Section: Event‐based Vision Algorithms
confidence: 99%
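Tightly coupled fusion of this kind is typically posed as a single nonlinear least-squares problem over the camera states, combining reprojection residuals from both frame-based and event-based feature tracks with IMU preintegration residuals. The sketch below shows that structure schematically; the pinhole intrinsics, weights, and helper names are placeholders, not the paper's exact cost.

```python
import numpy as np

def reproject(pose, landmark, fx=200.0, fy=200.0, cx=120.0, cy=90.0):
    """Toy pinhole projection of a 3-D landmark given a pose (R, t)
    mapping world points into the camera frame. Intrinsics are
    illustrative placeholders."""
    R, t = pose
    p = R @ landmark + t
    return np.array([fx * p[0] / p[2] + cx,
                     fy * p[1] / p[2] + cy])

def joint_cost(poses, frame_tracks, event_tracks, imu_residuals,
               w_frame=1.0, w_event=1.0, w_imu=1.0):
    """Schematic tightly-coupled objective in the spirit of the fusion
    described above: reprojection residuals from features tracked in
    standard frames AND in event-stream representations, plus IMU
    preintegration residuals, minimized jointly over the poses.
    Each track is a (pose_index, landmark_xyz, observed_pixel) tuple."""
    cost = 0.0
    for k, landmark, observed_px in frame_tracks:      # frame-based tracks
        cost += w_frame * np.sum((reproject(poses[k], landmark) - observed_px) ** 2)
    for k, landmark, observed_px in event_tracks:      # event-based tracks
        cost += w_event * np.sum((reproject(poses[k], landmark) - observed_px) ** 2)
    for r in imu_residuals:   # preintegrated rotation/velocity/position terms
        cost += w_imu * float(r @ r)
    return cost
```

Because both sensor modalities contribute residuals to the same objective, either one can carry the estimate when the other degrades (motion blur for frames, near-stationary motion for events).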
“…Window size becomes a critical parameter. Minimum mean position error of 0.54% and yaw error of 0.03°/m (Vidal et al., 2018): improved performance by combining standard frames with IMU and event data.…”
Section: SLAM
confidence: 99%
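The table entry above reports accuracy as mean position error in percent of traveled distance and yaw drift per meter. A small sketch of one common way such metrics are computed from an estimated trajectory already aligned to ground truth (alignment itself is assumed done; the per-mean convention is one of several in use):

```python
import numpy as np

def traj_error_metrics(est_pos, gt_pos, est_yaw, gt_yaw):
    """Mean position error as % of traveled distance and yaw drift in
    deg/m. est_pos/gt_pos: (N, 3) aligned positions; est_yaw/gt_yaw:
    (N,) yaw angles in degrees."""
    # Total distance traveled along the ground-truth trajectory.
    dist = np.sum(np.linalg.norm(np.diff(gt_pos, axis=0), axis=1))
    pos_err = np.mean(np.linalg.norm(est_pos - gt_pos, axis=1))
    # Wrap yaw differences into [-180, 180) before averaging.
    yaw_err = np.abs((est_yaw - gt_yaw + 180.0) % 360.0 - 180.0)
    return 100.0 * pos_err / dist, np.mean(yaw_err) / dist
```

Normalizing by traveled distance makes runs of different lengths comparable, which is why surveys quote these figures rather than raw meter errors.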
“…The system estimates the 6-DOF motion of the camera in natural environments by tracking sparse features in event streams. This is the first event-based visual odometry system that uses sparse feature points, but it still requires intensity images from a conventional camera to detect the feature points at first. Compared with visual odometry systems based entirely on conventional cameras, this system avoids a tremendous computational burden of feature tracking.…”
[Table row spliced into the excerpt: Rosinol et al., 2018 [16] | 3D | natural | ✗ | ✓ | fusion with grayscale camera and IMU]
Section: Related Work
confidence: 99%
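The excerpt describes a hybrid flow: detect sparse features on conventional intensity frames, then track them using the event stream. A toy sketch of that data flow follows; the nearest-event centroid "tracker" is purely illustrative and not the cited system's method, and the OpenCV parameters are arbitrary.

```python
import numpy as np
import cv2  # OpenCV, used only for corner detection on the intensity frame

def detect_features(gray_frame, max_corners=100):
    """Detect sparse corners on a conventional intensity frame, as in the
    hybrid pipeline described above (detection on events is avoided)."""
    pts = cv2.goodFeaturesToTrack(gray_frame, max_corners, 0.01, 7)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2))

def track_with_events(features, events, radius=5.0):
    """Toy event-based update: shift each feature toward the centroid of
    recent events within `radius` pixels of it. Real event trackers use
    far richer models; this only illustrates the data flow.
    `events` is an (M, 2) array of (x, y) event coordinates."""
    tracked = features.copy()
    for i, (x, y) in enumerate(features):
        d = np.linalg.norm(events - np.array([x, y]), axis=1)
        nearby = events[d < radius]
        if len(nearby):
            tracked[i] = nearby.mean(axis=0)
    return tracked
```

The division of labor mirrors the excerpt's point: the expensive detection step runs rarely on frames, while the cheap per-event updates carry the tracking load between frames.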