2022 IEEE Radar Conference (RadarConf22)
DOI: 10.1109/radarconf2248738.2022.9764172
Convolutional Neural Networks for Robust Classification of Drones

Cited by 12 publications (6 citation statements)
References 17 publications
“…Previous work has reported the impact of SNR on classification performance [11]. Data from the UoB testbed are progressing the work on developing robust classifiers for challenging realistic conditions and an example of this is detailed in Table 3 from recently published results using Convolutional Neural Networks (CNN) for classifying drones and birds [12].…”
Section: Processing Capabilities
confidence: 99%
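The excerpt above refers to the impact of SNR on classification performance [11]. A common way to probe this is to inject controlled noise into the radar returns before forming the spectrograms that feed the classifier. The Python sketch below illustrates that idea only; the function name and the SNR sweep are illustrative assumptions, not the procedure of the cited papers.

```python
import numpy as np

def add_noise_at_snr(iq, snr_db):
    """Corrupt a complex baseband return with white Gaussian noise so that
    the resulting signal-to-noise ratio is approximately snr_db."""
    signal_power = np.mean(np.abs(iq) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = np.sqrt(noise_power / 2) * (
        np.random.randn(*iq.shape) + 1j * np.random.randn(*iq.shape)
    )
    return iq + noise

# Example robustness sweep (clean_iq would be complex slow-time samples):
# for snr_db in (0, 5, 10, 20):
#     noisy_iq = add_noise_at_snr(clean_iq, snr_db)
#     # form a spectrogram from noisy_iq and pass it to the classifier
```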
“…The vertical axis is radial velocity in m/s and the colour scale is signal power in dB. Table 3: Percentage accuracies recorded with CNN trained and tested on drone and bird spectrograms taken from rural and urban locations. In total, analysis was performed on 7623 spectrograms of drones and birds with a 60:30:10 split between training, test and validation phases [12]. Spectrogram of I3-D drone from (a) real data and (b) synthetic data generated using truth data from the real flight.…”
confidence: 99%
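The quoted excerpt describes a CNN trained and tested on drone and bird spectrograms with a 60:30:10 split between training, test and validation. The PyTorch sketch below reproduces that setup in outline only; the input size, network architecture and training loop are illustrative assumptions, not the network reported in [12].

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# Stand-in for the 7623 drone/bird spectrograms described above:
# single-channel 128x128 images with binary labels (0 = bird, 1 = drone).
# Shapes and labels are assumptions for illustration only.
spectrograms = torch.randn(7623, 1, 128, 128)
labels = torch.randint(0, 2, (7623,))
dataset = TensorDataset(spectrograms, labels)

# 60:30:10 split between training, test and validation, as in the excerpt.
n = len(dataset)
n_train, n_test = int(0.6 * n), int(0.3 * n)
n_val = n - n_train - n_test
train_set, test_set, val_set = random_split(dataset, [n_train, n_test, n_val])

# A small CNN; layer sizes are illustrative, not the architecture from [12].
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 2),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(2):  # short run; real training would use more epochs
    for x, y in DataLoader(train_set, batch_size=64, shuffle=True):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
```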
“…Drones are known to produce micro-Doppler shifts in radar echoes due to the modulation of incident radar waves by their rotating blades [13]. These shifts are unique radar signatures that can aid in the detection and classification of drones from other clutter, such as birds and humans [14][15][16][17][18][19][20]. Drones can be classified into three types based on the relative direction between the rotating plane of blades and the ground plane.…”
Section: Introduction
confidence: 99%
“…This is by providing the means to consistently track the presence of any micro-Doppler components in the radar signature of drones (see Section 1.2). Micro-Doppler, which is a key target classification cue [20][21][22][23][24][25][26][27][28][29][30][31][32], is represented by spectral lines in Doppler spectrograms. They have a harmonic structure and originate from the motion of rotors on-board the UAS.…”
confidence: 99%
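As the two excerpts above describe, rotating blades impose a periodic phase modulation on the radar return, which appears as harmonic spectral lines around the body Doppler line in a Doppler spectrogram. The SciPy sketch below shows the basic mechanism with a toy signal; all signal parameters are invented for illustration and are not taken from the cited papers.

```python
import numpy as np
from scipy.signal import stft

prf = 2000.0                                # pulse repetition frequency (slow-time sampling rate), Hz
t = np.arange(0, 1.0, 1 / prf)
body = np.exp(2j * np.pi * 300 * t)         # bulk-body Doppler line at 300 Hz

# Rotor blades impose a periodic phase modulation (blade rate 50 Hz,
# modulation index 8), producing harmonic sidebands around the body line.
rotor = 0.3 * body * np.exp(2j * np.pi * 8 * np.sin(2 * np.pi * 50 * t))
noise = 0.05 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))
iq = body + rotor + noise

# Short-time Fourier transform over slow time gives the Doppler spectrogram;
# the micro-Doppler harmonics show up as parallel spectral lines.
f, seg_t, Z = stft(iq, fs=prf, nperseg=128, noverlap=96, return_onesided=False)
spectrogram_db = 20 * np.log10(np.abs(np.fft.fftshift(Z, axes=0)) + 1e-12)
```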
“…For ATR, we use the machine-learning-based Multi-stage Decision Tree (MDT) in [28] that utilises kinematic and micro-Doppler features to discriminate between drone and non-drone targets. When tested on real representative data, it has a reasonable ATR accuracy compared with more advanced micro-Doppler classifiers, for example, neural-networks-based ones [21][22][23][24][25][26][27]. Below and unlike [32], we retrain the MDT model on MTT with PoT data to better illustrate how the consistent tracking of mUAS micro-Doppler components benefits ATR.…”
confidence: 99%
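The MDT of [28] mentioned above combines kinematic and micro-Doppler features in a staged decision process. The scikit-learn sketch below conveys only the general idea of feature-based tree classification; it is a single tree rather than the multi-stage scheme of [28], and the feature names and data are placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature table: one row per track, with kinematic features
# (speed, acceleration) and micro-Doppler features (bandwidth, number of
# harmonic lines). All values here are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))        # [speed, acceleration, mD bandwidth, n harmonics]
y = rng.integers(0, 2, size=500)     # 0 = non-drone, 1 = drone

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```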