2014 American Control Conference
DOI: 10.1109/acc.2014.6858995

Combined laser and vision-aided inertial navigation for an indoor unmanned aerial vehicle

Abstract: As unmanned aerial vehicles are used in more environments, flexible navigation strategies are required to ensure safe and reliable operation. Operation in the presence of degraded or denied GPS signal is critical in many environments, particularly indoors, in urban canyons, and hostile areas. Two techniques, laser-based simultaneous localization and mapping (SLAM) and monocular visual SLAM, in conjunction with inertial navigation, have attracted considerable attention in the research community. This paper pres…


Cited by 18 publications (13 citation statements)
References 14 publications
“…Table I summarizes related research on indoor drone localization, with [17], [18] reviewed in [23] and [15], [16] covered in [24]. Our review shows a maximum of 3 different types of technologies for sensor fusion are considered, and only in a few cases is the impact of individual sensors investigated.…”
Section: Related Work
confidence: 99%
“…13 cm 2D / 26 cm 3D, 40 Hz, no, no;
[16] UWB, RGB-D camera: ≤22 cm (x, y or z), not reported, camera-only (≤46 cm) / UWB-only (≤22 cm), no;
[17] IMU, laser SLAM, visual SLAM: 30 cm 2D, 10–25 Hz, no, no;
[18] IMU, RGB-D camera (Kinect): 8 cm x-axis, 2D/3D, not reported, 30 Hz, no, no
… real-life drone flights in an industrial lab environment. Fig.…”
confidence: 99%
“…Lynen et al. (2013) presented a general purpose multi-sensor fusion extended Kalman filter (MSF-EKF) that can handle different types of delayed measurement signals from multiple sensors, and provides a more accurate attitude estimation for UAV control and navigation. Magree and Johnson (2014) proposed an integrated navigation system, which combines both visual SLAM and laser SLAM with an EKF-based inertial navigation system. The monocular visual SLAM finds data association and estimates the state of UAVs, while the laser SLAM system performs scan-to-map matching under a Monte Carlo framework.…”
Section: Multi-sensor Fusion
confidence: 99%
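The EKF-based fusion pattern summarized in the statement above, an inertial state corrected by pose fixes from visual SLAM and laser scan-to-map matching, can be sketched minimally as follows. This is an illustrative toy, not the cited implementation: the state layout, measurement models, and noise covariances are assumptions chosen for clarity.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard EKF measurement update: fuse observation z into state x."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)              # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x, P

# Assumed state: 2D position + velocity [px, py, vx, vy].
# Both sensors are modeled as direct (noisy) position observers.
x = np.zeros(4)
P = np.eye(4)
H = np.hstack([np.eye(2), np.zeros((2, 2))])

# A noisier visual-SLAM fix followed by a tighter laser scan-match fix.
x, P = ekf_update(x, P, np.array([1.0, 2.0]), H, 0.5 * np.eye(2))
x, P = ekf_update(x, P, np.array([1.1, 1.9]), H, 0.1 * np.eye(2))

print(np.round(x[:2], 3))  # position pulled toward the fused fixes
```

Sequentially applying one update per sensor, each with its own covariance R, is the standard way such filters weight heterogeneous measurements; handling delayed measurements, as in MSF-EKF, additionally requires buffering and re-applying updates, which is omitted here.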
“…In [61] a cooperative laser and visual SLAM approach for a UAV that depends solely on a laser, a camera and the inertial sensor has been proposed. The characteristic of the vision subsystem was the correlation of the detected features with the vehicle state and the fact that the detected point database was updated in every loop by an EKF.…”
Section: Visual Localization and Mapping
confidence: 99%
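The per-loop update of a detected-point database mentioned above can be illustrated with a scalar Kalman-style landmark refinement: each stored feature estimate is blended with its new observation according to their variances. This sketch is hypothetical, not the method of [61]; the database layout, feature ids, and noise values are invented for illustration.

```python
import numpy as np

def update_landmark(mean, var, meas, meas_var):
    """Fuse a stored landmark estimate with a fresh observation of it."""
    gain = var / (var + meas_var)            # trust ratio of old vs new
    new_mean = mean + gain * (meas - mean)   # blended position
    new_var = (1.0 - gain) * var             # uncertainty shrinks
    return new_mean, new_var

# Feature database: id -> (2D position estimate, scalar variance).
db = {7: (np.array([4.0, 1.0]), 1.0)}

# Features re-detected on this loop, assumed already data-associated.
detections = {7: np.array([4.2, 0.8])}

for fid, z in detections.items():
    mean, var = db[fid]
    db[fid] = update_landmark(mean, var, z, meas_var=0.25)

print(db[7])  # refined estimate, reduced variance
```

Repeating this every loop is the essence of an EKF-maintained point database; a full implementation would also correlate landmark errors with the vehicle state, which a per-landmark scalar filter deliberately ignores.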