2018
DOI: 10.1017/s0263574718000577
Vision-only egomotion estimation in 6DOF using a sky compass

Abstract: A novel pure-vision egomotion estimation algorithm is presented, with extensions to Unmanned Aerial Systems (UAS) navigation through visual odometry. Our proposed method computes egomotion in two stages using panoramic images segmented into sky and ground regions. Rotations (in 3DOF) are estimated with a customised algorithm that measures the motion of the sky image, which is affected only by the rotation of the aircraft and not by its translation. The rotation estimate is then used to derotate the op…
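The two-stage idea in the abstract — estimate rotation from the sky, then remove its contribution from the measured optic flow — can be sketched as follows. This is a minimal illustration under a simple spherical-flow model, not the paper's implementation; the function name and interfaces are assumptions.

```python
import numpy as np

def derotate_flow(flow, dirs, omega):
    """Subtract the rotational component of optic flow on the view sphere.

    flow  : (N, 3) measured flow vectors, one per viewing direction
    dirs  : (N, 3) unit viewing directions (panoramic camera)
    omega : (3,)   angular velocity, e.g. estimated from the sky region

    Under pure rotation a point seen in direction d moves with
    d_dot = -omega x d; subtracting that term leaves only the
    translation-induced flow.
    """
    rot_flow = -np.cross(omega, dirs)  # broadcasts omega over all directions
    return flow - rot_flow
```

Applied to flow that is purely rotational, the residual should vanish, which is the property the two-stage scheme relies on before estimating translation from the ground region.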

Cited by 4 publications (5 citation statements)
References 36 publications
“…This finding will have to be confirmed in further experiments including outbound trajectories with various shapes and distances. At last, the relative mean homing error of 0.7% performed with AntBot is currently rather good in comparison with the current state-of-the-art in visual odometry, such as the 1.3% relative precision of a sky compass-based egomotion estimation system [73]. The different ways in which celestial cues are acquired could also be one of the reasons why the relative error was lower in the case of Sahabot 2 than AntBot.…”
Section: Discussion
confidence: 86%
“…This OF method precisely estimates ΔM_t from the images f_{t-1} and f_t. To do this, f_{t-1} is shifted by two reference amounts ±Δx_ref along the x axis to obtain two translation reference images f_{x±}, shifted by two reference amounts ±Δy_ref along the y axis to obtain two translation reference images f_{y±}, and rotated by two reference amounts ±Δr_ref about the z axis to obtain two rotation reference images f_{r±}, respectively. From [7], the image-interpolation algorithm assumes that the image deformation is continuous and linear in the camera motion. Hence, the image f_t can be approximated by…”
Section: The Honeybee-Vision Inspired OF Methods
confidence: 99%
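The reference-image construction described in that statement — shift and rotate the previous frame by fixed reference amounts, then fit the current frame as a linear combination — can be sketched with NumPy/SciPy. This is a hedged sketch of the non-iterative image-interpolation idea, not the cited implementation; the function name, default reference amounts, and border-cropping margin are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import shift, rotate

def estimate_motion(f_prev, f_curr, dx_ref=1.0, dy_ref=1.0, dr_ref=1.0, margin=8):
    """Non-iterative image-interpolation motion estimate.

    Builds shifted/rotated reference images from f_prev and solves a
    linear least-squares problem for (dx, dy, dr) pixels/degrees,
    assuming the image deformation is continuous and linear in the
    camera motion.
    """
    # Translation references along x (axis 1) and y (axis 0), rotation about z.
    fx_p, fx_m = shift(f_prev, (0, dx_ref)), shift(f_prev, (0, -dx_ref))
    fy_p, fy_m = shift(f_prev, (dy_ref, 0)), shift(f_prev, (-dy_ref, 0))
    fr_p = rotate(f_prev, dr_ref, reshape=False)
    fr_m = rotate(f_prev, -dr_ref, reshape=False)

    crop = np.s_[margin:-margin, margin:-margin]  # drop border interpolation artifacts
    # Central-difference basis images: d(f)/d(dx), d(f)/d(dy), d(f)/d(dr) at zero motion.
    A = np.stack([
        ((fx_p - fx_m) / (2 * dx_ref))[crop].ravel(),
        ((fy_p - fy_m) / (2 * dy_ref))[crop].ravel(),
        ((fr_p - fr_m) / (2 * dr_ref))[crop].ravel(),
    ], axis=1)
    b = (f_curr - f_prev)[crop].ravel()
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # estimated (dx, dy, dr)
```

Because the estimate is a single least-squares solve against precomputed reference images, it needs no iteration — the property the citing paper highlights for its relative-position estimator.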
“…To estimate the robot's motion and position visually, a honeybee-vision-inspired OF measurement method is introduced into our relative-position estimation; it estimates global image motion with a simple, non-iterative implementation [7]. We consider a mobile robot translating along two axes (x, y) and rotating about the third axis (z).…”
Section: Relative Position Estimation
confidence: 99%
“…There have been past insect biology-inspired studies using celestial cues for navigation. The sky itself can be a reference in day time with sun, clouds and scattering [20], resulting in patterns that are stable [21]. Some implementations have been developed by using sky-polarised light [20] to achieve autonomous navigation both on the ground [22,23] and as part of a flying navigation system on a drone [24].…”
Section: Contribution of This Study
confidence: 99%