2012
DOI: 10.1177/0278364912455256
Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments

Abstract: RGB-D cameras provide both color images and per-pixel depth estimates. The richness of this data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor d…

Cited by 151 publications (102 citation statements)
References 68 publications
“…Research groups on robotics have proposed different techniques for obstacle avoidance, based on sensors like LiDAR [4][5][6] and RGB-D [7,8], which show robustness to identifying obstacles; but implementing these devices in a compact MAV is difficult, expensive, and also, these consume additional electrical power. When we work with UAVs and want to implement other device onboard, we need to consider the payload that they can carry, limiting the use of any UAV.…”
Section: Related Work
confidence: 99%
“…Based on the low-level control system of Bebop, the motion in the y-axis can be controlled by the roll [48,49]. It is necessary to estimate the mathematical model that relates roll control with the linear speed in the y-axis; the transfer function of the mathematical model is shown in Equation (8). This model is simulated and presented in Figure 8 in order to observe the step response.…”
Section: System Identification
confidence: 99%
“…This platform was used mostly to perform autonomous exploration and navigation tasks such as the work from Shen et al [22,23], Weiss et al [24], Bachrach et al [4,6,5] and Pravitra et al [21]. The platform offers high performance on-board processors with around 600-650 grams payload.…”
Section: Related Work
confidence: 99%
“…Du et al. (2011) followed a similar approach but allowed user interaction. The RGB-D SLAM method (Engelhard et al., 2011; Endres et al., 2012) and the method of Bachrach et al. (2012) were both based on the idea of initial registration using visual feature points, although they used different feature extraction operators. Dryanovski et al. (2012) performed the initial registration based on edge features extracted from the colour images.…”
Section: Related Work
confidence: 99%