2020 25th International Conference on Pattern Recognition (ICPR), 2021
DOI: 10.1109/icpr48806.2021.9413241
Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors

Cited by 90 publications (48 citation statements)
References 28 publications
“…Both audio and video streams are then processed to extract features, which are then fed to a classifier to perform the detection. The results show the audio-assisted system (95.74%) significantly improves the performance of the visual detection. The authors of [130] develop a drone detection and tracking system with a sensor-fusion extension that makes the detection decision based on the outputs of multiple sensors (acoustic sensors, thermal cameras, and daylight cameras), with the possibility to configure the sensor selection and to weight the outputs of the selected sensors. However, across the widely available settings of sensor selection and fusion, the detection accuracy is not evaluated for hybrid detection.…”
Section: Sensor Fusion
confidence: 98%
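The configurable sensor selection with weighted outputs described in the statement above can be sketched as a simple weighted-average fusion rule. This is a minimal illustration, not the paper's actual method; the sensor names, weights, and threshold are assumptions chosen for the example.

```python
# Minimal sketch of weighted sensor fusion for a drone-detection decision.
# Sensor names, weights, and the threshold are illustrative assumptions.

def fuse_detections(scores: dict[str, float],
                    weights: dict[str, float],
                    threshold: float = 0.5) -> bool:
    """Combine per-sensor confidence scores (0..1) into one decision.

    Only sensors present in both `scores` and `weights` with a positive
    weight take part, mirroring configurable sensor selection.
    """
    selected = [s for s in scores if weights.get(s, 0.0) > 0.0]
    if not selected:
        return False
    total = sum(weights[s] for s in selected)
    fused = sum(weights[s] * scores[s] for s in selected) / total
    return fused >= threshold

# Example: the thermal camera is trusted most, the acoustic sensor least.
scores = {"daylight": 0.40, "thermal": 0.90, "acoustic": 0.60}
weights = {"daylight": 1.0, "thermal": 2.0, "acoustic": 0.5}
print(fuse_detections(scores, weights))  # fused ~= 0.71 -> True
```

Deselecting a sensor is just a matter of dropping it from `weights`, which is one way the "widely available settings" of such a system could be exercised for evaluation.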
“…However, across the widely available settings of sensor selection and fusion, the detection accuracy is not evaluated for hybrid detection. Similar to [130] and [135], the authors of [136] and [137] propose fusing visual and radar detection. A Scanning Surveillance Radar System (SSRS) aided by a camera first detects a drone using radar, and its identification is then confirmed by the camera.…”
Section: Sensor Fusion
confidence: 99%
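The radar-then-camera pipeline quoted above is a two-stage cascade: a cheap, wide-area radar detection gates an expensive camera-based confirmation. A hedged sketch, in which the track fields, thresholds, and predicate functions are hypothetical stand-ins for real sensor models:

```python
# Two-stage cascade: radar proposes a candidate track, camera confirms it.
# All fields and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Track:
    radar_snr_db: float       # radar signal-to-noise ratio of the track
    camera_drone_prob: float  # classifier probability from the camera crop

def radar_detects(t: Track, snr_threshold_db: float = 10.0) -> bool:
    # Stage 1: the radar flags any track with sufficient return strength.
    return t.radar_snr_db >= snr_threshold_db

def camera_confirms(t: Track, prob_threshold: float = 0.8) -> bool:
    # Stage 2: the camera classifier confirms the identification.
    return t.camera_drone_prob >= prob_threshold

def classify(t: Track) -> str:
    if not radar_detects(t):
        return "no detection"
    return "drone" if camera_confirms(t) else "unconfirmed target"

print(classify(Track(radar_snr_db=14.0, camera_drone_prob=0.93)))  # drone
```

The design point of such a cascade is that the camera only processes tracks the radar has already localized, which keeps the expensive visual identification step off the full search volume.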
“…As noted in [32], there is little publicly available information on this topic. Recently, a public benchmark was proposed in [37] for the assessment of the efficiency of detection means. Reference [38] provides some figures for tested mitigation means.…”
Section: Data About the Efficiency of Technical Means
confidence: 99%
“…The sensors applied for drone surveillance include: cameras (mono, RGB [8,9,10,11,12,13,14], multi- and hyperspectral, short-/longwave infrared [14]), radar [15], radio direction finders [16,17,18], acoustic sensors (single [14,19,20], array, matrix), and laser detection and ranging. Various fusion techniques [14,21,22] have also appeared, mainly with visual, infrared, and acoustic sensors. In the case of ground transit, the use of geophones is also common [23,24].…”
Section: Related Literature
confidence: 99%