2019 12th International Conference on Developments in eSystems Engineering (DeSE)
DOI: 10.1109/dese.2019.00093

Implementation of Autonomous Visual Detection, Tracking and Landing for AR.Drone 2.0 Quadcopter

Cited by 21 publications (12 citation statements)
References 20 publications
“…The Computer Vision (CV) system plays an important role in the majority of outdoor landing systems, since real-time target tracking is required. Color-based detection algorithms [15] or visually distinctive tagging [16] have been applied to recognize a moving platform. Several works merge GPS and CV approaches; e.g., Feng et al. [17] observed the landing platform with the camera at close distance and relied on GPS data otherwise.…”
Section: Related Work (mentioning)
confidence: 99%
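
The GPS/CV hand-over attributed to Feng et al. [17] above can be sketched as a simple range-based switch. The fragment below is an illustrative Python sketch under assumed names, frames, and threshold; none of it is taken from the cited papers.

# Minimal sketch (assumptions throughout): prefer the camera-based estimate of
# the landing platform at close range, fall back to GPS telemetry otherwise.
from dataclasses import dataclass
from typing import Optional, Tuple
import math


@dataclass
class PlatformEstimate:
    x: float        # metres, in the drone's local frame (assumed convention)
    y: float
    source: str     # "vision" or "gps"


def fuse_platform_estimate(
    gps_xy: Tuple[float, float],             # platform position from GPS telemetry
    vision_xy: Optional[Tuple[float, float]],  # platform position from the camera, if detected
    handover_range_m: float = 5.0,           # assumed range below which vision is trusted
) -> PlatformEstimate:
    """Use the camera estimate when the platform is close and detected; otherwise GPS."""
    if vision_xy is not None and math.hypot(*vision_xy) < handover_range_m:
        return PlatformEstimate(vision_xy[0], vision_xy[1], "vision")
    return PlatformEstimate(gps_xy[0], gps_xy[1], "gps")


if __name__ == "__main__":
    print(fuse_platform_estimate(gps_xy=(12.0, 3.0), vision_xy=(2.1, 0.4)))  # -> vision
    print(fuse_platform_estimate(gps_xy=(12.0, 3.0), vision_xy=None))        # -> gps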
“…Computer vision has several applications in robotics. For instance, Respall et al. [2] applied it to quadcopters; that project focused on tracking an unmanned ground vehicle (UGV) platform using the color information in the image.…”
Section: Related Work (mentioning)
confidence: 99%
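
The color-based tracking idea described in that excerpt can be illustrated with a short OpenCV sketch: threshold the frame in HSV and take the centroid of the largest blob. The color range and function names below are assumptions for illustration, not the cited project's actual pipeline.

# Minimal sketch (assumed HSV range): color-based detection of a platform in a
# camera frame via thresholding and contour centroid.
import cv2
import numpy as np


def detect_platform_center(frame_bgr: np.ndarray):
    """Return the (u, v) pixel centroid of the largest orange blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))   # assumed orange range
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])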
“…They can be programmed using a multi-platform application called Choregraphe or programming languages such as C++ or Python. Furthermore, the robots have a wide variety of sensors, including seven touch sensors, omnidirectional microphones, ultrasonic sensors, and 2D cameras, which allow them to interact with their surroundings. NAO robots are now in their 6th version and have become a staple in education and research alike.…”
Section: Introduction (mentioning)
confidence: 99%
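
As a rough illustration of the programming route mentioned above (the NAOqi Python SDK rather than Choregraphe), the snippet below makes the robot speak and reads a head touch sensor. The IP address is a placeholder and the memory key is the one commonly documented for the front head tactile sensor; treat the fragment as an assumption, not a verified recipe.

# Minimal sketch of scripting a NAO robot through the NAOqi Python SDK.
from naoqi import ALProxy

NAO_IP = "192.168.1.10"   # placeholder address of the robot
NAO_PORT = 9559           # default NAOqi port

tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)
tts.say("Hello, I am NAO.")

# Reading one of the touch sensors through the shared memory module.
memory = ALProxy("ALMemory", NAO_IP, NAO_PORT)
head_touched = memory.getData("FrontTactilTouched")  # 1.0 while the sensor is pressed
print("Front head sensor pressed:", bool(head_touched))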
“…This mode of operation requires considerable logistics, making autonomous landing on a moving platform a better choice for many applications. Aiming at this technology, the works in [7][8][9][10] estimate the relative position and attitude between the quadrotor and the platform and perform vision-based navigation, while control is carried out using a traditional proportional-integral-derivative (PID) algorithm. However, the quadrotor may be disturbed by ground effects and winds during the landing process.…”
Section: Introduction (mentioning)
confidence: 99%
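
The traditional PID control mentioned in that excerpt can be sketched as follows; the gains, the 20 Hz loop rate, and the velocity-command interface are illustrative assumptions, not parameters from [7][8][9][10].

# Minimal sketch: drive the horizontal error between quadrotor and platform
# (e.g., from a vision estimate) to zero with one PID controller per axis.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


pid_x = PID(kp=0.8, ki=0.05, kd=0.3)   # assumed gains
pid_y = PID(kp=0.8, ki=0.05, kd=0.3)

error_x, error_y = 1.5, -0.4   # example platform-relative position error in metres
dt = 0.05                      # assumed 20 Hz control loop
vx_cmd = pid_x.update(error_x, dt)     # interpreted as a velocity command
vy_cmd = pid_y.update(error_y, dt)
print(vx_cmd, vy_cmd)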
“…m/s from its initial position [50, 7, 2]^T m. The three-dimensional (3D) landing trajectory of the quadrotor is shown in Figure 5, where the black curves belong to the trajectory cluster planned by MPC during the whole process. It can be seen that the landing accuracy gradually improves as the quadrotor approaches the target.…”
mentioning
confidence: 99%