Automatic detection and tracking of a flying target in video sequences acquired from a camera mounted on another Unmanned Aerial Vehicle (UAV) is a challenging task due to the non-stationary camera, the dynamic motion of the moving target, and the high computational cost of real-time operation. In this paper, our aim is to automatically detect and track a moving UAV from another UAV while both are in flight. To operate efficiently in real-time applications, we develop a vision-based, low-cost hardware system integrated with an independent ground control station. We first created a new public dataset, called ATAUAV, that includes different types of UAV images obtained from videos recorded in our experiments and from Google Images searches for the training process. The deep-learning-based YOLOv3-Tiny (You Only Look Once) detector is used for target detection, as it provides the highest accuracy and the fastest results. A Kernelized Correlation Filter (KCF), adapted to the YOLO detections and running on low-cost hardware, is used for real-time tracking of the detected target. We compared the performance of the proposed approach with different tracking algorithms. Experimental results show that the proposed approach achieves the highest accuracy rate of 82.7% and a mean speed of 29.6 fps on a CPU. The dataset can be downloaded at http://cogvi.atauni.edu.tr/ResearchLab/PageDetail/Our-ATAUAVs-Dataset-86.

INDEX TERMS Artificial neural networks, computer vision, KCF, object detection, object recognition, target tracking, unmanned aerial vehicles, YOLO.
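To illustrate the detection-then-tracking hand-off described above, the following is a minimal sketch using OpenCV's DNN module with YOLOv3-Tiny and a KCF tracker. It is not the paper's implementation: the file names (yolov3-tiny.cfg, yolov3-tiny_atauav.weights, uav_flight.mp4), the confidence threshold, and the single-target selection logic are illustrative assumptions, and a build of OpenCV with the contrib tracking module is assumed.

```python
# Sketch: YOLOv3-Tiny detection feeding a KCF tracker (assumed setup, not the authors' code).
import cv2

net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny_atauav.weights")  # placeholder files
out_names = net.getUnconnectedOutLayersNames()

def detect_uav(frame, conf_thresh=0.5):
    """Return the highest-confidence box as (x, y, w, h), or None if nothing is detected."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    best = None
    for out in net.forward(out_names):
        for det in out:
            conf = float(det[5:].max())
            if conf > conf_thresh and (best is None or conf > best[0]):
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                best = (conf, (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)))
    return best[1] if best else None

cap = cv2.VideoCapture("uav_flight.mp4")  # placeholder video source
tracker = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if tracker is None:
        box = detect_uav(frame)                # run the detector until the target appears
        if box is not None:
            tracker = cv2.TrackerKCF_create()  # cv2.legacy.TrackerKCF_create() on some builds
            tracker.init(frame, box)
    else:
        ok, box = tracker.update(frame)        # fast per-frame KCF update on the CPU
        if not ok:
            tracker = None                     # tracking lost: fall back to re-detection
        else:
            x, y, bw, bh = map(int, box)
            cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
cap.release()
```

The design intent mirrored here is that the comparatively expensive detector only runs when no target is being tracked, while the lightweight correlation filter carries the per-frame load, which is what makes CPU-only real-time rates plausible.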