2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw53098.2021.00151

Comparing Representations in Tracking for Event Camera-based SLAM

Cited by 25 publications (26 citation statements) | References 26 publications
“…In recent years, event cameras have gained increasing attention. They have been applied to many computer vision tasks, including object recognition [47,52], segmentation [14], corner detection [59,79], gesture recognition [23,33,78], optical flow estimation [22,54], depth estimation [32,37], Simultaneous Localization And Mapping (SLAM) [19,43], and autonomous driving [25,51]. While RGB cameras struggle with motion blur, event cameras are, by design, highly sensitive to illumination changes even in extremely overexposed and underexposed scenes [18].…”
Section: Related Work
confidence: 99%
“…Although event cameras can outperform standard visual sensors in severe illumination and dynamic range conditions, they mainly generate unsynchronized information about the environment. This makes traditional vision algorithms unable to process the outputs of these sensors [41]. Additionally, using the spatio-temporal windows of events along with the data obtained from other sensors can provide rich pose estimation and tracking information.…”
Section: VSLAM Setup
confidence: 99%
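The spatio-temporal windowing mentioned in this excerpt is what lets frame-based tracking pipelines consume event data, and choosing between such window representations is the question the tracked paper's title refers to. Below is a minimal sketch, not code from any of the cited works, of two common conversions; it assumes events arrive as (t, x, y, polarity) tuples with polarity in {-1, +1}, a known sensor resolution (H, W), and an illustrative decay constant tau.

```python
# Minimal sketch: convert an asynchronous event stream into frame-like
# representations usable by conventional tracking code.
import numpy as np

def events_to_frame(events, H, W, t_start, t_end):
    """Accumulate signed polarities inside a spatio-temporal window."""
    frame = np.zeros((H, W), dtype=np.float32)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            frame[y, x] += p              # "event frame" representation
    return frame

def events_to_time_surface(events, H, W, t_end, tau=0.03):
    """Per-pixel exponentially decayed timestamp of the most recent event."""
    last_t = np.full((H, W), -np.inf)     # -inf marks pixels with no event yet
    for t, x, y, _ in events:
        if t <= t_end:
            last_t[y, x] = max(last_t[y, x], t)
    surface = np.zeros((H, W), dtype=np.float32)
    valid = np.isfinite(last_t)
    surface[valid] = np.exp((last_t[valid] - t_end) / tau)
    return surface
```

An event frame discards timing inside the window, while a time surface keeps a decayed per-pixel timestamp; which variant tracks better under a given motion is exactly the kind of trade-off a representation comparison for event-camera SLAM examines.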
“We propose a semi-joint event-inertial initialization method that estimates the initialization parameters, including scale, gravity direction, initial velocities, and accelerometer and gyroscope biases, in two steps (event-only initialization and event-inertial initial optimization). We implement the event-based stereo VIO system, ESVIO, in C++ and evaluate it on four public datasets [25,26,27,28]. The results demonstrate that our system achieves good accuracy and robust performance compared with the state of the art, without compromising real-time performance.…”
Section: Introduction
confidence: 99%
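The parameters listed in this excerpt (scale, gravity direction, initial velocities) are typically recovered with a linear visual-inertial alignment. The sketch below is not the ESVIO implementation; it only illustrates such a linear system under simplifying assumptions: bias-free IMU preintegration terms alpha[i] and beta[i] over each interval, identity camera-IMU extrinsics, and up-to-scale keyframe positions and rotations from the event-based front end. All names are placeholders.

```python
# Minimal sketch of linear visual-inertial alignment (scale, gravity,
# velocities) via least squares; biases are ignored in this simplification.
import numpy as np

def align_visual_inertial(p_bar, R, alpha, beta, dt):
    """Solve, for all intervals i, the linear constraints
       s*(p_bar[i+1]-p_bar[i]) - v_i*dt - 0.5*g*dt^2 = R_i @ alpha_i
       v_{i+1} - v_i - g*dt                          = R_i @ beta_i
    for unknowns [v_0..v_{n-1}, g, s]."""
    n = len(p_bar)                        # number of keyframes
    m = n - 1                             # number of intervals
    A = np.zeros((6 * m, 3 * n + 4))      # columns: velocities, gravity, scale
    b = np.zeros(6 * m)
    for i in range(m):
        r = 6 * i
        # position constraint for interval i
        A[r:r+3, 3*i:3*i+3] = -dt[i] * np.eye(3)              # v_i
        A[r:r+3, 3*n:3*n+3] = -0.5 * dt[i]**2 * np.eye(3)     # g
        A[r:r+3, -1]        = p_bar[i+1] - p_bar[i]           # s
        b[r:r+3]            = R[i] @ alpha[i]
        # velocity constraint for interval i
        A[r+3:r+6, 3*(i+1):3*(i+1)+3] = np.eye(3)             # v_{i+1}
        A[r+3:r+6, 3*i:3*i+3]         = -np.eye(3)            # v_i
        A[r+3:r+6, 3*n:3*n+3]         = -dt[i] * np.eye(3)    # g
        b[r+3:r+6]                    = R[i] @ beta[i]
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    v = x[:3*n].reshape(n, 3)
    return x[-1], x[3*n:3*n+3], v         # scale, gravity, velocities
```

In a two-step scheme like the one described, the gyroscope and accelerometer biases and a refined gravity magnitude would be handled in the subsequent event-inertial optimization; this sketch covers only the closed-form part.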