2019 Integrated Communications, Navigation and Surveillance Conference (ICNS)
DOI: 10.1109/icnsurv.2019.8735240

Detection, Tracking and Classification of Aircraft and Drones in Digital Towers Using Machine Learning on Motion Patterns

Cited by 33 publications (18 citation statements); references 11 publications.
“…Seo et al. [18] used a short-time Fourier transform (STFT) to convert the UAV sound signal into a spectrogram and applied a CNN for classification. Thai et al. [27] used a camera to capture flight video of UAVs, employing optical flow together with Harris corner detection and a CNN to localize and track the UAV's flight trajectory, and finally applied k-nearest neighbour (KNN) for UAV classification. Departing from the optical-based approaches, Rozantsev et al. [28] stacked UAV motion windows over several consecutive frames and combined them with a regression-based motion stabilization algorithm to achieve UAV tracking in low-light or near-invisible conditions.…”
Section: Related Work
confidence: 99%
“…[25], [26]), or through camera-based target tracking from video streams [27], [28], or by statistically monitoring network traffic data [29], [30]. Acoustic-based approaches are typically sensitive to environmental noise, while camera image quality depends on surrounding conditions such as building occlusion, ambient lighting, etc.…”
Section: Introduction
confidence: 99%
“…In work [12], the authors trained a k-nearest neighbour (KNN) classifier to differentiate between drones and aircraft. They used x, y coordinates extracted from videos of installations at an airport.…”
Section: Related Work
confidence: 99%
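The approach attributed to [12] — a KNN classifier over x, y coordinates extracted from video — can be sketched minimally. The paper does not specify the features, so the two motion features below (mean speed and mean turn magnitude per track) and the toy trajectories are illustrative assumptions, not taken from the source:

```python
import math

def track_features(track):
    """Summarize an (x, y) track as (mean speed, mean turn magnitude).

    Illustrative feature choice: drones tend to hover and turn sharply,
    while aircraft fly faster, smoother paths.
    """
    speeds = [math.hypot(x1 - x0, y1 - y0)
              for (x0, y0), (x1, y1) in zip(track, track[1:])]
    headings = [math.atan2(y1 - y0, x1 - x0)
                for (x0, y0), (x1, y1) in zip(track, track[1:])]
    # Wrap heading differences into [-pi, pi] before taking magnitudes.
    turns = [abs(math.atan2(math.sin(h1 - h0), math.cos(h1 - h0)))
             for h0, h1 in zip(headings, headings[1:])]
    return (sum(speeds) / len(speeds), sum(turns) / max(len(turns), 1))

def knn_classify(query, examples, k=3):
    """Plain k-nearest-neighbour majority vote over labelled feature vectors."""
    ranked = sorted(examples,
                    key=lambda e: sum((a - b) ** 2 for a, b in zip(e[0], query)))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical training tracks: slow, jittery drone motion versus fast,
# straight aircraft motion.
drone = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2), (1, 2)]
aircraft = [(0, 0), (10, 1), (20, 2), (30, 3), (40, 4), (50, 5)]
examples = [(track_features(drone), "drone"),
            (track_features(aircraft), "aircraft")]

query = track_features([(0, 0), (1, 1), (0, 2), (1, 3), (0, 4), (1, 5)])
print(knn_classify(query, examples, k=1))  # → drone
```

In practice the x, y observations would come from a detection-and-tracking front end rather than hand-written lists, and k would be tuned on held-out tracks.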
“…Acoustic sensors do not require line-of-sight (LOS); however, they suffer from short range, as drones can operate very quietly [12,28], and data gathered by microphone systems are prone to wind and environmental clutter. On the other hand, LOS vision under daylight is essential for techniques that utilize camera images [14,29]. Using thermal or laser-based cameras to overcome this issue increases the cost significantly.…”
Section: Literature Review and Contributions
confidence: 99%