2014
DOI: 10.5772/58429

Hovering by Gazing: A Novel Strategy for Implementing Saccadic Flight-Based Navigation in GPS-Denied Environments

Abstract: Hovering flies are able to stay still in place above flowers and to burst into movement towards a new object of interest (a target). This suggests that the sensorimotor control loops implemented onboard could be usefully mimicked for controlling Unmanned Aerial Vehicles (UAVs). In this study, the fundamental head-body movements occurring in free-flying insects were simulated in a sighted twin-engine robot with a mechanical decoupling inserted between its eye (or gaze) and its body. The robot based on thi…
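As a reading aid only, here is a minimal one-axis sketch of the decoupling idea summarized in the abstract: a fast loop keeps the gaze locked on the target while a slower loop steers the body so that the eye returns to its neutral orientation in the body. This is not the authors' controller; the gains, time step and first-order dynamics below are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D sketch (single rotation axis) of a "hovering by gazing" loop:
# the eye is decoupled from the body, the gaze locks onto the target, and the
# body is steered so that the eye-in-body angle returns to zero.
# All gains and dynamics are assumed values, not taken from the paper.

dt = 0.01                  # integration step [s]
k_gaze, k_body = 8.0, 2.0  # proportional gains (assumed)

body_angle = 0.2           # body orientation w.r.t. the inertial frame [rad]
eye_in_body = 0.0          # eye orientation relative to the body [rad]
target_bearing = 0.0       # target direction in the inertial frame [rad]

for _ in range(1000):
    gaze = body_angle + eye_in_body             # gaze direction in the inertial frame
    retinal_error = target_bearing - gaze       # angular error seen by the eye
    eye_in_body += k_gaze * retinal_error * dt  # fast gaze loop: keep the target centred
    body_rate = k_body * eye_in_body            # slow body loop: re-align body with gaze
    body_angle += body_rate * dt

print(f"residual retinal error: {target_bearing - (body_angle + eye_in_body):+.4f} rad")
```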

Cited by 5 publications (6 citation statements)
References 52 publications
“…Documentation, tutorials, the parts list and the whole software package are available at the X4-MaG project website: http://www.gipsa-lab.fr/projet/RT-MaG/. Future work will consist of implementing, onboard the free-flying X4-MaG, minimalistic bio-inspired strategies such as those described in [23,24,25]. Such minimalist vision-based strategies would help future insect-scale robots (e.g., [26]) rely on their own onboard sensors.…”
Section: Results (mentioning)
confidence: 99%
“…The retinal errors r_φ and r_θ are defined as the angular position of the target in the eye frame. They can also be seen as the angular error between the target position and the gaze direction, as defined in [25].…”
Section: B. Control Strategy and State Estimation, 1) Attitude and Pos… (mentioning)
confidence: 99%
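To make the retinal-error notion quoted above concrete, the hypothetical helper below computes the angular offset (r_φ, r_θ) of a target expressed in the eye frame, given the target position, the eye position and a pan/tilt gaze orientation. The function name, frame conventions and rotation order are assumptions for illustration, not the formulation used in [25].

```python
import numpy as np

# Hypothetical helper: angular offset of the target in the eye (gaze) frame.
# Frame conventions and rotation order are assumed, not taken from [25].

def retinal_error(target_pos, eye_pos, gaze_pan, gaze_tilt):
    """Return (r_phi, r_theta): azimuth/elevation of the target in the eye frame [rad]."""
    d = np.asarray(target_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    # Rotate the line of sight into the eye frame (pan about z, then tilt about y).
    c_p, s_p = np.cos(gaze_pan), np.sin(gaze_pan)
    c_t, s_t = np.cos(gaze_tilt), np.sin(gaze_tilt)
    R_pan = np.array([[c_p, s_p, 0.0], [-s_p, c_p, 0.0], [0.0, 0.0, 1.0]])
    R_tilt = np.array([[c_t, 0.0, s_t], [0.0, 1.0, 0.0], [-s_t, 0.0, c_t]])
    d_eye = R_tilt @ R_pan @ d
    r_phi = np.arctan2(d_eye[1], d_eye[0])                        # horizontal retinal error
    r_theta = np.arctan2(d_eye[2], np.hypot(d_eye[0], d_eye[1]))  # vertical retinal error
    return r_phi, r_theta

# If the gaze points exactly at the target, both errors are (numerically) zero:
print(retinal_error([1.0, 0.0, 1.0], [0.0, 0.0, 0.0], 0.0, np.arctan2(1.0, 1.0)))
```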
“…The use of a gimbal eye can overcome this issue and allow the visual tracking task to be independent of the robot's displacements (see the very recent eXom drone developed by SenseFly, which is equipped with a gimbal "head" [23]). This work is part of our steering-by-gazing strategy ([24], [25]), in which the gaze orientation is used directly to determine the robot's position relative to the target and lets the robot follow a specific trajectory (turn around the target, stay directly above it, keep a certain distance, etc.). Section II describes the motivation for using a bio-inspired visual system and the related work.…”
Section: Introduction (mentioning)
confidence: 99%
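As an illustration of how a gaze direction locked on a target could be turned into a relative position estimate, here is a hypothetical sketch for the simplest case of a target lying on a flat ground plane with a known eye altitude. The flat-ground assumption, the altitude source and the names are not from [24] or [25]; they only make the geometric idea explicit.

```python
import numpy as np

# Hypothetical sketch: recover the target's horizontal offset from the gaze
# angles, assuming a flat ground plane and a known eye altitude (e.g. sonar).

def relative_position_from_gaze(gaze_pan, gaze_elevation, altitude):
    """Relative (x, y) offset of the target in the horizontal plane [m].

    gaze_pan       : horizontal gaze angle w.r.t. the robot heading [rad]
    gaze_elevation : downward gaze angle below the horizon [rad], > 0
    altitude       : height of the eye above the target plane [m]
    """
    horizontal_range = altitude / np.tan(gaze_elevation)
    return horizontal_range * np.cos(gaze_pan), horizontal_range * np.sin(gaze_pan)

# Robot 1.5 m above the target plane, gazing 30 deg to the left and 45 deg down:
x, y = relative_position_from_gaze(np.deg2rad(30), np.deg2rad(45), 1.5)
print(f"target is {x:.2f} m ahead and {y:.2f} m to the left")  # 1.30 m, 0.75 m
```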
“…In addition to the idea that the head orientation may serve as a zero reference for controlling body roll movements (which are therefore based on 'head in body' inputs only), as proposed by Boeddeker and Hemmi (2010), the existence of an internal 'body in horizon' estimation suggested by our improved model would mean that the body control process based on the 'head in body' is liable to compensate for some visual errors by taking a reference body orientation with respect to the horizon. Body control in insects may therefore be enhanced by using the relative positions of the visual system, as established theoretically (Manecy et al., 2014), and this may account for the near-perfect hovering performance observed in some flying insects, including the hoverfly. In addition, a command based on an internal body roll estimation with respect to the horizon would give an arbitrary head command without any conflict with the two-nested feedback loop of the VPM (see Fig.
Section: Model Based on Vision and Proprioception (VPM) (mentioning)
confidence: 99%
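A toy numerical sketch of the kind of nested loop discussed above: an outer visual loop levels the head with respect to the horizon, and an inner proprioceptive loop rolls the body so that the neck ('head in body') angle returns to zero. The gains, the disturbance and the first-order dynamics are assumptions made purely to illustrate the structure; this is not the VPM of the cited paper.

```python
import numpy as np

# Toy two-nested roll-control sketch (all values assumed, for illustration only):
# outer loop  : vision keeps the head level with the horizon,
# inner loop  : proprioception rolls the body to cancel the 'head in body' angle.

dt, k_head, k_body = 0.002, 40.0, 6.0
body_roll, head_in_body = 0.0, 0.0
log = []

for step in range(5000):
    disturbance = 0.3 * np.sin(2 * np.pi * 0.5 * step * dt)  # slow roll perturbation [rad/s]
    head_in_world = body_roll + head_in_body
    # Outer (visual) loop: keep the head level with the horizon.
    head_in_body += -k_head * head_in_world * dt
    # Inner (proprioceptive) loop: roll the body so the neck angle returns to zero,
    # i.e. the body re-aligns with the stabilized head.
    body_roll += (k_body * head_in_body + disturbance) * dt
    log.append(head_in_world)

print(f"peak head-in-world error over the last second: {max(abs(e) for e in log[-500:]):.4f} rad")
```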