2019 American Control Conference (ACC)
DOI: 10.23919/acc.2019.8815101

Estimation and Tracking of a Moving Target by Unmanned Aerial Vehicles

Abstract: An image-based control strategy along with estimation of target motion is developed to track dynamic targets without motion constraints. To the best of our knowledge, this is the first work that utilizes a bounding box as image features for tracking control and estimation of a dynamic target without motion constraints. The features generated from a You-Only-Look-Once (YOLO) deep neural network can relax the assumption of continuous availability of feature points made in most of the literature and minimize the gap for app…

Cited by 14 publications (10 citation statements)
References 30 publications
“…Although non-cooperative vision-based localization shows promising results in indoor and outdoor environments, it fails to determine the position information when a direct line of sight is absent. Also, the target is required to continuously remain in the field of view of the UAV on-board sensors during the entire mission for the UAV to be able to determine its positional information [19]. To avoid the drawbacks of GPS-based localization and vision-based non-cooperative localization, hybrid systems are proposed; [20] and [21] present a localization and tracking system using GPS and visual information where one system can be used if the other system fails.…”
Section: Related Work
confidence: 99%
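The hybrid scheme quoted above (GPS and visual localization backing each other up when one fails) can be sketched as a simple fallback fusion rule. This is an illustrative sketch only; the function name `fuse_position` and the averaging of the two fixes are assumptions, not the method of [20] or [21], which would use a proper filter.

```python
def fuse_position(gps_fix, vision_fix):
    """Hybrid localization: use whichever source is currently valid.

    gps_fix / vision_fix: (x, y) tuples, or None when that system fails
    (GPS outage, or the target leaving the camera's line of sight).
    """
    if gps_fix is not None and vision_fix is not None:
        # Both available: a plain average stands in for a proper filter.
        return tuple((g + v) / 2.0 for g, v in zip(gps_fix, vision_fix))
    if gps_fix is not None:
        return gps_fix        # vision lost line of sight
    if vision_fix is not None:
        return vision_fix     # GPS-denied environment
    return None               # both failed: no position estimate
```

The point of the structure is that neither sensor is a single point of failure: each branch degrades gracefully to the surviving source.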
“…k_p4 and k_p5 are the gain factors that control the maximum force generated from (19) and (20), respectively. The positive constants c_5 and c_6 are less than 1.0; therefore, the UAV will receive the vertical directional force before the repulsive force in the XY plane.…”
Section: B. D-APF Repulsive Force
confidence: 99%
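Since equations (19) and (20) are not reproduced here, the mechanism described in the quote (a gain capping the maximum repulsive force, and a constant below 1.0 shrinking the activation radius) can only be illustrated generically. The sketch below uses a classic APF repulsive profile; the function name, signature, and force profile are all assumptions, not the cited paper's D-APF formulation.

```python
def repulsive_force(dist, d_influence, k_p, c):
    """Saturated APF-style repulsive force magnitude (illustrative).

    k_p plays the role of the gains k_p4/k_p5: it caps the maximum force.
    c < 1.0 plays the role of c_5/c_6: it scales the activation distance,
    so an axis with a larger c receives its force at a greater range.
    """
    d_active = c * d_influence        # activation distance for this axis
    if dist >= d_active:
        return 0.0                    # outside the influence region
    # Classic repulsive profile, clipped to the gain so force stays bounded.
    raw = 0.5 * (1.0 / dist - 1.0 / d_active) ** 2
    return min(k_p, raw)
```

With a larger c on the vertical axis than in the XY plane, the vertical force activates first, matching the behavior the quoted passage describes.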
“…Many types of research on the UKF method are in progress, such as comparisons of different filtering methods and sensor fusion [4]. The UKF is presently used in a wide range of applications, including target tracking [31]. Other data fusion studies aim to improve the performance of mobile robots [32].…”
Section: Introduction
confidence: 99%
“…To further improve the accuracy of the motion information of robots, many filtering approaches exist and continue to be developed in the literature. Many applications are using the unscented Kalman filter (UKF) in various domains nowadays, ranging from target tracking [15] to multi-sensor fusion [16,17]. Another form of sensor fusion research to improve the performance of existing mobile robots is found in [18], where two methods (Dempster-Shafer theory and Kalman filtering) are used to integrate a global positioning system (GPS) and an inertial measurement unit (IMU), and the obtained results allowed for selecting the most accurate method for robot localization at an appropriate cost.…”
Section: Introduction
confidence: 99%
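The Kalman-filter fusion the two quotes above refer to can be shown in its simplest scalar form. A full UKF replaces the linear update below with sigma-point propagation through nonlinear models; this stand-in only illustrates how sequential measurement updates (e.g. a GPS fix followed by an IMU-derived estimate) shrink the state uncertainty, and is not the algorithm of [15]–[18].

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P: prior state estimate and its variance
    z, R: measurement and its variance
    Returns the corrected estimate and reduced variance.
    """
    K = P / (P + R)            # Kalman gain: trust ratio of prior vs measurement
    x_new = x + K * (z - x)    # corrected estimate
    P_new = (1.0 - K) * P      # uncertainty shrinks after each update
    return x_new, P_new

# Fusing two sources sequentially (values are purely illustrative):
x, P = 0.0, 1.0                              # prior
x, P = kalman_update(x, P, z=1.0, R=1.0)     # GPS-like measurement
x, P = kalman_update(x, P, z=0.8, R=0.5)     # IMU-derived measurement
```

Each update moves the estimate toward the new measurement in proportion to the gain and leaves a strictly smaller variance, which is why adding a second sensor improves localization accuracy in the fusion studies cited above.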