AIAA Guidance, Navigation, and Control Conference 2009
DOI: 10.2514/6.2009-5680
Vision-Based Precision Landings of a Tailsitter UAV

Abstract: We present a method of performing precision landings of a vertical takeoff and landing (VTOL) unmanned air vehicle (UAV) using an onboard vision sensor and information about the aircraft's orientation and altitude above ground level (AGL). A method for calculating the three-dimensional location of the UAV relative to a ground target of interest is presented, as well as a navigational controller to position the UAV above the target. A method is also presented to prevent the UAV from moving in a way that wi…
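A minimal sketch (not the authors' implementation) of the relative-localization step described in the abstract: a single image detection of the ground target is back-projected through a pinhole camera model, rotated into a level navigation frame using the aircraft's attitude, and scaled by the AGL altitude to intersect a flat ground plane. The camera intrinsics, mounting, attitude values, and function names below are illustrative assumptions.

# Sketch under assumptions: pinhole camera, flat ground, known camera-to-body
# mounting, NED navigation frame. Numeric values are placeholders.
import numpy as np

def euler_to_R(roll, pitch, yaw):
    """Rotation from body frame to local NED frame (Z-Y-X Euler convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def target_relative_position(pixel, K, R_body_to_ned, R_cam_to_body, agl):
    """Intersect the pixel's viewing ray with the flat ground plane.

    Returns the target position relative to the UAV in NED coordinates
    (north, east, down); the down component equals the AGL altitude.
    """
    u, v = pixel
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray, camera frame
    ray_ned = R_body_to_ned @ R_cam_to_body @ ray_cam   # same ray, NED frame
    if ray_ned[2] <= 0:
        raise ValueError("ray does not point toward the ground")
    scale = agl / ray_ned[2]                            # reach the ground plane
    return scale * ray_ned

# Illustrative usage with placeholder intrinsics and a near-hover attitude.
K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])
R_cam_to_body = np.eye(3)                      # assume camera looks straight down in hover
R_body_to_ned = euler_to_R(0.02, -0.01, 1.2)   # small hover tilt, arbitrary heading
rel = target_relative_position((350, 260), K, R_body_to_ned=R_body_to_ned,
                               R_cam_to_body=R_cam_to_body, agl=5.0)
print("target offset (N, E, D) [m]:", rel)

The horizontal components of this offset can then drive a position controller that centers the UAV above the target before descent, in the spirit of the navigational controller the abstract mentions.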

Cited by 6 publications (3 citation statements)
References 14 publications
“…In addition, more sensors need to be added to the aircraft to achieve autonomous navigation. By incorporating GPS navigation and vision-based navigation [8], the UAV can take off and land automatically in urban environments and complete trajectory-tracking and indoor vertical-flight tasks.…”
Section: Discussion (mentioning)
confidence: 99%
“…For instance, Saripalli et al. [12] and Merz et al. [13] have both successfully landed a helicopter by detecting and tracking a predefined landmark using vision. Some contributions have targeted controlled landing using estimates from images [14], and particularly optic flow [15]; however, the landing site is arbitrarily selected.…”
Section: Introduction (mentioning)
confidence: 99%
“…The authors in [21] use video from the onboard camera and the orientation and altitude of the ground vehicle to calculate the UAV position with respect to (w.r.t.) the target.…”
Section: Introduction (mentioning)
confidence: 99%