2015
DOI: 10.1155/2015/251379
Unmanned Aerial Vehicle Navigation Using Wide-Field Optical Flow and Inertial Sensors

Abstract: This paper offers a set of novel navigation techniques that rely on the use of inertial sensors and wide-field optical flow information. The aircraft ground velocity and attitude states are estimated with an Unscented Information Filter (UIF) and are evaluated with respect to two sets of experimental flight data collected from an Unmanned Aerial Vehicle (UAV). Two different formulations are proposed, a full state formulation including velocity and attitude and a simplified formulation which assumes that the la…

Cited by 16 publications (8 citation statements). References 32 publications.
“…Each image is input to an arbitrary optical flow algorithm that estimates the pixel shifts $\dot{\alpha}$ and $\dot{\beta}$ between consecutive frames. For this configuration, it has been shown that, given the focal length of the sensor optics $f$, the camera roll and pitch attitude $\phi_c$ and $\theta_c$, the body-frame angular rates $p, q, r$, the velocity components $u, v, w$, and the range to the observed texture $\eta_z$, the optical flow estimate is given by the following equation (Chao et al.; Rhudy, Gu, Chao, & Gross):

$$
\begin{bmatrix} \dot{\alpha} \\ \dot{\beta} \end{bmatrix}
= -\frac{f + \beta\tan\theta_c - \alpha\tan\phi_c/\cos\theta_c}{\eta_z}
\begin{bmatrix} u + \alpha w/f \\ v + \beta w/f \end{bmatrix}
- \begin{bmatrix} fq - r\beta - p\alpha\beta/f + q\alpha^2/f \\ -fp + r\alpha - p\beta^2/f + q\alpha\beta/f \end{bmatrix}.
$$…”

Section: Theory (citation type: mentioning)
confidence: 99%
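The optical flow model quoted above can be sketched numerically. The following is a minimal illustration, not code from the paper: the function name, argument order, and the sign convention of the reconstructed equation are assumptions, so the original papers should be consulted before reuse.

```python
import numpy as np

def predict_optical_flow(alpha, beta, f, phi_c, theta_c, p, q, r, u, v, w, eta_z):
    """Predicted pixel-shift rates (alpha_dot, beta_dot) for one image point.

    alpha, beta : image-plane coordinates of the point (pixels)
    f           : focal length of the sensor optics (pixels)
    phi_c, theta_c : camera roll and pitch attitude (rad)
    p, q, r     : body-frame angular rates (rad/s)
    u, v, w     : velocity components (m/s)
    eta_z       : range to the observed texture (m)
    """
    # Translational contribution, scaled by the attitude-dependent factor
    # and inversely by the range to the texture.
    scale = -(f + beta * np.tan(theta_c)
              - alpha * np.tan(phi_c) / np.cos(theta_c)) / eta_z
    translational = scale * np.array([u + alpha * w / f,
                                      v + beta * w / f])
    # Rotational contribution, independent of range.
    rotational = np.array([
        f * q - r * beta - p * alpha * beta / f + q * alpha**2 / f,
        -f * p + r * alpha - p * beta**2 / f + q * alpha * beta / f,
    ])
    return translational - rotational
```

For a level camera at the image center ($\alpha = \beta = 0$) with no rotation, only the forward-velocity term survives, scaled by $-f/\eta_z$, which matches the intuition that closer texture produces faster flow.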
“…The Phastball SUAV airframe [43] was developed as a modular research platform and has been used for multiple sensor-fusion studies [44][45][46]. The Phastball Zero SUAV is shown in Figure 3.…”

Section: Experimental Set-up (citation type: mentioning)
confidence: 99%
“…Finally, for the GPS-based heading and pitch estimates, 3° of uncertainty is assumed. Similar to the process-noise covariance matrix, these values make up a diagonal measurement-error covariance matrix, as shown in Equation (17).…”

Section: Nonlinear Kalman Filter (citation type: mentioning)
confidence: 99%
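The diagonal measurement-error covariance described in that statement can be assembled as follows. This is a hypothetical sketch: only the 3° heading/pitch uncertainty comes from the text, while the measurement-vector layout and the velocity standard deviation are placeholder assumptions.

```python
import numpy as np

# 1-sigma uncertainties for an assumed measurement vector
# [v_x, v_y, v_z, heading, pitch]; the velocity value is a placeholder.
sigma_v = 0.1                   # m/s, placeholder GPS velocity std dev
sigma_angle = np.deg2rad(3.0)   # 3 deg of heading/pitch uncertainty (from the text)

# Diagonal measurement-error covariance matrix R: variances on the
# diagonal, zero cross-correlation between measurement channels.
R = np.diag([sigma_v**2, sigma_v**2, sigma_v**2,
             sigma_angle**2, sigma_angle**2])
```

Keeping R diagonal encodes the common assumption that the sensor channels have uncorrelated errors, which keeps the Kalman measurement update cheap and easy to tune channel by channel.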
“…The experimental UAV used for this study is West Virginia University's (WVU's) Red Phastball platform, shown in Figure 1. The Red Phastball is primarily used for sensor fusion research [17][18][19], and its avionics package [20] was updated for this study to include a Novatel OEM-615® dual-frequency GPS/Globalnaya Navigazionnaya Sputnikovaya Sistema (GLONASS) receiver, with which GPS pseudorange, carrier-phase, and signal-strength measurements were recorded at a rate of 10 Hz. In addition, for use in signal-strength calibration and as pitch and roll reference solutions, the Red Phastball flew a Goodrich VG34® mechanical vertical gyroscope, and the analog pitch and roll measurements were recorded using a micro-controller with a sampling rate of 50 Hz.…”

Section: Experimental Set-up (citation type: mentioning)
confidence: 99%