2016
DOI: 10.1007/978-3-319-42417-0_24

Insect-Inspired Visual Navigation for Flying Robots

Abstract: This paper discusses the implementation of insect-inspired visual navigation strategies in flying robots, in particular focusing on the impact of changing height. We start by assessing the information available at different heights for visual homing in natural environments, comparing results from an open environment against one where trees and bushes are closer to the camera. We then test a route following algorithm using a gantry robot and show that a robot would be able to successfully navigate a r…


Cited by 9 publications (12 citation statements)
References 18 publications
“…To demonstrate the feasibility of our biomimetic visual navigation algorithm, we showed that route navigation can function effectively despite differences in the height of the training route compared to the route being navigated, whether images are simply stored or used to train an ANN to generate a compact route encoding. This reinforces the work of (Zeil et al., 2003; Philippides et al., 2016; Murray and Zeil, 2017), which showed that the utility of single images for visual homing increases with increasing height. In addition, we showed that visual information from single images can transfer between wheeled and flying robots, despite changes in tilt during flight and a forward-facing camera.…”
Section: Discussion (supporting)
confidence: 86%
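The stored-image variant of route navigation quoted above can be illustrated with a minimal "perfect memory" sketch (an illustrative assumption, not the authors' exact implementation): panoramic snapshots are stored along the training route, and the robot steers in whichever direction makes its current view most similar to any stored snapshot.

```python
import numpy as np

def route_heading(current, route_images, n_rotations=72):
    """Familiarity-based route following (illustrative sketch): rotate the
    current panoramic view in azimuth and pick the rotation that best
    matches ANY stored route image; that rotation gives the heading."""
    h, w = current.shape
    best = (np.inf, 0.0)
    for k in range(n_rotations):
        shift = int(k * w / n_rotations)          # azimuthal rotation in pixels
        rotated = np.roll(current, shift, axis=1)
        # root-mean-square difference to the closest stored snapshot
        d = min(np.sqrt(np.mean((rotated - m) ** 2)) for m in route_images)
        angle = 360.0 * shift / w                 # pixel shift -> degrees
        if d < best[0]:
            best = (d, angle)
    return best[1]                                # heading offset in degrees
```

Because the comparison is against the nearest snapshot rather than a single goal image, this scheme tolerates moderate viewpoint changes along the route, which is the property the quoted statement reports surviving a change in height.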
“…Because of visual aliasing, there is little success in place recognition when routes and goals are at different heights (as judged by comparing images and checking whether the best-matching image is at the same location; data not shown). However, in (Philippides et al., 2016) we saw that when height varies, the region over which an image can be used to recover directional information is larger than the region over which place recognition is possible. This previous work compared images from 40 cm to 2 m, so we wanted to assess whether images from a UAV could be used to guide a ground-based robot.…”
Section: Can Images From a Quadcopter Be Used By A Ground-Based Robot To Recover A Heading? (mentioning)
confidence: 95%
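The "recover directional information" comparison described in this statement is commonly computed as a rotational image difference function (RIDF); the sketch below is a hedged illustration of that standard technique (function names are hypothetical, and this is not necessarily the exact procedure used in the cited work). A deep, unique minimum in the RIDF means the snapshot still carries heading information at the current position and height.

```python
import numpy as np

def ridf(current, snapshot):
    """Rotational image difference function: RMS pixel difference between
    the current panorama and a stored goal snapshot at every azimuthal
    rotation (column shift)."""
    w = current.shape[1]
    return np.array([
        np.sqrt(np.mean((np.roll(current, s, axis=1) - snapshot) ** 2))
        for s in range(w)
    ])

def heading_from_snapshot(current, snapshot):
    """Heading estimate in degrees: the rotation minimising the RIDF."""
    w = current.shape[1]
    return 360.0 * np.argmin(ridf(current, snapshot)) / w
```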
“…The neuronal mechanisms underlying visual landmark navigation are not yet known, although a variety of models for navigation on different spatial ranges have been proposed on the basis of behavioral experiments, anatomical evidence, functional imaging, and electrophysiological recordings (Philippides et al. 2016; Hoinville and Wehner 2018; Stone et al. 2017; Webb and Wystrach 2016; Schulte et al. 2019; Baddeley et al. 2012; Honkanen et al. 2019). At the level of motion-sensitive wide-field neurons, the spatial landmark constellation that guides bees to their goal could be shown to lead to a characteristic time-dependent response profile during the intersaccadic intervals of navigation flights, providing unique information about the vicinity of the goal (Mertes et al. 2014; Egelhaaf et al. 2014).…”
Section: Active Visual Strategies During Local… (mentioning)
confidence: 99%
“…Here it is demonstrated that helical flight paths could be used to scan the vertical and horizontal dimensions simultaneously. The authors in [47] use a robotic gantry to demonstrate that visual homing can be tolerant to height offsets between the inbound and outbound routes without the use of special flight paths. In the above studies, a forward-facing camera system has been adopted.…”
Section: Introduction (mentioning)
confidence: 99%