2009
DOI: 10.1007/s10514-009-9140-0

Implementation of wide-field integration of optic flow for autonomous quadrotor navigation

Abstract: Insects are capable of robust visual navigation in complex environments using efficient information extraction and processing approaches. This paper presents an implementation of insect-inspired visual navigation that uses spatial decompositions of the instantaneous optic flow to extract local proximity information. The approach is demonstrated in a corridor environment on an autonomous quadrotor micro-air-vehicle (MAV), where all the sensing and processing, including altitude, attitude, and outer-loop control …
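The core idea can be pictured as projecting the optic flow pattern measured around the body onto a small set of spatial weighting functions, whose low-order coefficients serve as proximity and speed cues. A minimal sketch in Python, assuming a planar ring of azimuthal viewing directions and Fourier harmonics as the weighting functions; the function name, the synthetic flow pattern, and the cue interpretations are illustrative assumptions, not the paper's actual patterns or gains:

    import numpy as np

    def wfi_outputs(gamma, qdot):
        # Inner products of the measured flow pattern qdot(gamma) with
        # low-order Fourier harmonics over azimuth angles gamma (radians).
        dg = gamma[1] - gamma[0]
        a0 = np.sum(qdot) * dg / (2 * np.pi)            # uniform component: forward-speed cue
        a1 = np.sum(qdot * np.cos(gamma)) * dg / np.pi  # cosine harmonic: lateral-offset cue
        b1 = np.sum(qdot * np.sin(gamma)) * dg / np.pi  # sine harmonic: heading/yaw cue
        return a0, a1, b1

    # Example: crude corridor-like flow magnitude for a centered vehicle.
    gamma = np.linspace(-np.pi, np.pi, 360, endpoint=False)
    qdot = np.sin(gamma) ** 2
    print(wfi_outputs(gamma, qdot))

In this sketch the symmetric pattern yields a nonzero uniform term and near-zero first harmonics, consistent with a vehicle flying centered in a corridor.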

Cited by 146 publications (98 citation statements). References 38 publications.
“…However, so far this method has been validated only on tethered drones. An error-correction method has also been demonstrated on a quadcopter equipped with omnidirectional vision to fly in corridors by continuously making corrections aimed at reducing the difference between the measured optic flow and the optic flow expected from flight at the desired altitude, attitude, speed, heading, and distance from the walls [63]. Quadcopters and flapping-wing drones are intrinsically unstable and must compensate for the positional drift generated by noise in gyroscope and accelerometer signals in order to hover in place and maintain attitude.…”
Section: Review Insight
confidence: 99%
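The error-correction idea quoted above can be sketched as a simple feedback law: steer so as to shrink the gap between the measured flow pattern and the flow expected for the desired state. The expected-flow model, gain, and weighting below are illustrative assumptions, not the cited paper's controller:

    import numpy as np

    def expected_flow(gamma, v_des, d_des):
        # Idealized lateral flow magnitude for flight at speed v_des,
        # centered at distance d_des from each wall of a planar corridor.
        return (v_des / d_des) * np.abs(np.sin(gamma))

    def lateral_correction(measured, gamma, v_des=1.0, d_des=0.5, k=0.8):
        # Weight the measured-minus-expected flow error toward the side
        # walls (where |sin| is largest) to form a lateral velocity command.
        err = measured - expected_flow(gamma, v_des, d_des)
        return -k * float(np.mean(err * np.sign(np.sin(gamma))))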
“…Despite their limitations, these advantages have led to the popularization and commercialization of quadcopters and, consequently, to their inexpensive nature [31]. As a result, hovercraft and quadcopters have seen the widest range of applications, including obstacle avoidance, odometry, and lateral, ventral, and forward OF control [17,26,32]. However, their complex dynamics make the control problem very difficult, and most demonstrations have been on single DoFs or in very constrained environments.…”
Section: B. Insect Behavior and Cognitive Embodiment
confidence: 99%
“…The quadcopter design process proposed by Bouabdallah for sUAVs was used here; it first chooses a propulsion group based on overall weight [32]. Quadcopters over 1 kg are typically designed with a motor specification between 700-900 Kv (and between 1300-2200 Kv for quadcopters under 500 g).…”
Section: B. Beebot Platform Design
confidence: 99%
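The quoted rule of thumb maps all-up weight to a motor Kv band. A hypothetical helper that encodes it; the thresholds come from the quote, while the function name and the mid-range fallback (which the quote does not cover) are assumptions:

    def motor_kv_band(weight_g):
        # Rule of thumb: over 1 kg -> 700-900 Kv; under 500 g -> 1300-2200 Kv.
        if weight_g > 1000:
            return (700, 900)
        if weight_g < 500:
            return (1300, 2200)
        return (900, 1300)  # 500 g - 1 kg: not specified; placeholder interpolation

    print(motor_kv_band(1200))  # -> (700, 900)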
“…Based on the findings obtained at our laboratory on the fly's visual sensory system [12], several versions of the 2-pixel Local Motion Sensor (LMS) [10,13,14,44,46] were developed, using an algorithm introduced by [5,39], which was later called the "time of travel" scheme (see [2,33]). Several vision-based systems have previously been designed to measure the optic flow onboard UAVs (Unmanned Aerial Vehicles) [8,19,22], and in particular in the range experienced during lunar landing [20,28,52]. Most of these visual systems were quite demanding in terms of their computational requirements and/or their weight, or were not very well characterized, except for the optical mouse sensors [4], with which a standard error of approximately ±5°/s around 25°/s was obtained when measuring motion over a ±280°/s range.…”
Section: Introduction
confidence: 99%
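The "time of travel" scheme mentioned in this excerpt estimates angular speed from the delay between two neighboring photoreceptors detecting the same contrast edge. A minimal sketch under that reading; the 4-degree inter-receptor angle and the edge-timing interface are illustrative assumptions:

    def time_of_travel_omega(t_edge_px1, t_edge_px2, delta_phi_deg=4.0):
        # Angular speed (deg/s) = inter-receptor angle / time of travel of
        # a contrast edge between the two photoreceptors.
        dt = t_edge_px2 - t_edge_px1
        if dt <= 0:
            return None  # edge missed, or moving in the non-preferred direction
        return delta_phi_deg / dt

    print(time_of_travel_omega(0.000, 0.016))  # -> 250.0 deg/s

Note the reciprocal relationship: faster motion produces shorter delays, so measurement resolution is coarsest at the high end of the speed range.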