Detecting and classifying the modulation type of intercepted noisy LPI (low probability of intercept) radar signals in real time is a necessary survival technique in electronic intelligence systems. Most radar signals are designed to have LPI properties; therefore, LPI radar waveform recognition techniques (LWRT) have recently gained increasing attention. In this paper, we propose a multiple feature images joint decision (MFIJD) model with two different feature extraction structures that fully extract pixel features and obtain a pre-classification result for each feature image, suited to the non-stationary characteristics of most LPI radar signals. The core of this model is to feed the short-time autocorrelation feature image, the double short-time autocorrelation feature image, and the original signal time-frequency image (TFI) simultaneously into a hybrid-model classifier, which is suitable for non-stationary signals and offers higher universality. We demonstrate the performance of MFIJD by simulating 11 types of signals defined in this paper and generating training and test sets. Comparison with the literature shows that the proposed method not only has high universality for LPI radar signals but also adapts better to LPI radar waveform recognition in low-SNR (signal-to-noise ratio) environments. The overall recognition rate of the method reaches 87.7% when the SNR is −6 dB.

For one thing, some existing feature extraction methods are highly targeted, aimed mainly at specific radar emitter signals. For another, these methods rarely address noise effects and low SNR. In fact, radar signals, especially LPI radar signals, are inevitably subject to large amounts of noise during propagation and reception [5], and LPI radars usually have lower power, which makes them difficult to classify directly [6]. In current research, several LPI waveform recognition technologies (LWRT) use feature extraction and classification techniques. Time-frequency analysis (TFA) is widely used for feature extraction since LPI radar signals are usually non-stationary; examples include the smoothed pseudo-Wigner distribution (SPWD) [7], the Wigner-Ville distribution (WVD) [8], the short-time Fourier transform (STFT) [9–11], and the Choi-Williams distribution (CWD) [6,12–16]. Combined with deep learning from the field of computer vision [17] and neural network architectures, researchers have obtained better recognition results from the time-frequency features of signals [18]. The radar signal is first time-frequency transformed into a two-dimensional time-frequency image (TFI), which is then preprocessed and fed to a neural network for training. On the classifier-design side, methods include the multi-layer perceptron (MLP) [11], conditional decisions over different features [11,15], convolutional neural networks (CNN) [14], Elman neural networks (ENN) [6], and support vector machines (SVM) [6,16]. In addition, there have been hy...
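As a minimal illustration of the time-frequency preprocessing step described above, the sketch below simulates a single LFM pulse, adds noise at −6 dB SNR, and builds an STFT-based TFI with NumPy/SciPy. The pulse parameters, window length, and the choice of STFT (rather than the CWD or SPWD also mentioned above) are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np
from scipy.signal import stft

def lfm_pulse(fs=100e6, duration=10e-6, f0=5e6, bandwidth=20e6):
    """Complex linear frequency modulated (LFM) pulse."""
    t = np.arange(0, duration, 1 / fs)
    k = bandwidth / duration                       # chirp rate in Hz/s
    return np.exp(1j * 2 * np.pi * (f0 * t + 0.5 * k * t ** 2)), fs

def add_awgn(x, snr_db):
    """Add complex white Gaussian noise at the requested SNR (dB)."""
    p_sig = np.mean(np.abs(x) ** 2)
    p_noise = p_sig / 10 ** (snr_db / 10)
    noise = np.sqrt(p_noise / 2) * (np.random.randn(x.size) + 1j * np.random.randn(x.size))
    return x + noise

def tfi(x, fs, nperseg=64):
    """STFT magnitude normalized to [0, 1], used as a pixel feature image."""
    _, _, z = stft(x, fs=fs, nperseg=nperseg, return_onesided=False)
    mag = np.abs(z)
    return mag / mag.max()

if __name__ == "__main__":
    pulse, fs = lfm_pulse()
    image = tfi(add_awgn(pulse, snr_db=-6), fs)    # -6 dB matches the reported operating point
    print(image.shape)                             # (frequency bins, time frames)
```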
To achieve intelligent recognition, deep learning classifiers for radar waveforms are normally trained with transfer learning, in which a convolutional neural network pretrained on an external large-scale classification dataset (e.g., ImageNet) is used as the backbone. Although transfer learning can effectively avoid overfitting, transferred models are usually redundant and may not generalize well. To remove the dependence on transfer learning and achieve high generalization ability, this paper introduces neural architecture search (NAS) to search for a suitable radar waveform classifier for the first time. First, differentiable architecture search (DARTS), one of the key NAS techniques, is used to design a classifier for 15 kinds of low probability of intercept radar waveforms automatically. Then, a method with an auxiliary classifier, called flexible-DARTS, is proposed. By adding an auxiliary classifier in a middle layer, flexible-DARTS designs better-generalized classifiers than standard DARTS. Finally, the performance of the classifier in practical application is compared with related work. Simulations show that the model based on flexible-DARTS performs better, and the accuracy for the 15 kinds of radar waveforms reaches 79.2% at −9 dB SNR, which demonstrates the effectiveness of the proposed method for radar waveform recognition.
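The auxiliary-classifier idea behind flexible-DARTS can be sketched with a toy PyTorch model that attaches a second classification head to a middle layer and adds its loss to the main loss. This is not the searched DARTS cell; the layer sizes, the 0.4 auxiliary loss weight, and the AuxHeadCNN name are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AuxHeadCNN(nn.Module):
    """Toy classifier with an auxiliary head attached to a middle layer."""

    def __init__(self, num_classes=15):
        super().__init__()
        self.stage1 = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        # Auxiliary classifier operating on mid-level features.
        self.aux_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))
        self.main_head = nn.Sequential(nn.Flatten(), nn.Linear(64, num_classes))

    def forward(self, x):
        mid = self.stage1(x)
        return self.main_head(self.stage2(mid)), self.aux_head(mid)

# Training combines both losses, e.g. main + 0.4 * auxiliary (weight is illustrative).
model = AuxHeadCNN()
tfi_batch = torch.randn(8, 1, 64, 64)              # batch of time-frequency images
main_logits, aux_logits = model(tfi_batch)
labels = torch.randint(0, 15, (8,))
loss = nn.CrossEntropyLoss()(main_logits, labels) + 0.4 * nn.CrossEntropyLoss()(aux_logits, labels)
```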
Emitter signal waveform recognition and classification are necessary survival techniques in electronic warfare systems. Emitters use various power-management techniques and complex intra-pulse modulations, which can make the signal look like noise to an intercept receiver, so emitter waveform recognition at a low signal-to-noise ratio (SNR) has gained increased attention. In this study, we propose an autocorrelation feature image construction technique (ACFICT) combined with a convolutional neural network (CNN) to preserve the unique feature of each signal, and we design a structural optimization of the CNN input layer, called the hybrid model, to enhance the signal autocorrelation image; this differs from using a single image with a CNN to complete classification. We demonstrate the performance of ACFICT by comparing feature images generated by different signal pre-processing algorithms, using signal recognition rate, image stability degree, and image restoration degree as evaluation indicators. This paper simulates six types of signals by combining ACFICT with three types of hybrid model; comparison of the simulation results with the literature shows that the proposed method not only has high universality but also adapts better to waveform recognition in low-SNR environments. When the SNR is −6 dB, the overall recognition rate of the method reaches 88%.
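One plausible realization of an autocorrelation feature image, shown for illustration only, stacks the autocorrelation magnitudes of sliding frames into a lag-versus-time image that can be fed to the CNN input layer. The frame length, hop size, and the BPSK test signal below are assumptions, not the paper's ACFICT settings.

```python
import numpy as np

def short_time_autocorr_image(signal, frame_len=64, hop=16):
    """Build a 2-D feature image of per-frame autocorrelation magnitudes.

    Each column holds |R_xx(tau)| for tau = 0..frame_len-1 over one sliding
    frame, so modulation structure appears as a lag-versus-time image.
    """
    n_frames = 1 + (len(signal) - frame_len) // hop
    image = np.empty((frame_len, n_frames))
    for i in range(n_frames):
        frame = signal[i * hop: i * hop + frame_len]
        # Autocorrelation via FFT (Wiener-Khinchin), non-negative lags only.
        spectrum = np.fft.fft(frame, n=2 * frame_len)
        acf = np.fft.ifft(np.abs(spectrum) ** 2)[:frame_len]
        image[:, i] = np.abs(acf)
    return image / image.max()           # normalize so it can be treated as pixels

if __name__ == "__main__":
    t = np.arange(2048) / 2048
    chips = np.repeat(np.random.randint(0, 2, 16), 128)       # 16 chips, 128 samples each
    bpsk = np.exp(1j * np.pi * chips) * np.exp(2j * np.pi * 200 * t)
    img = short_time_autocorr_image(bpsk)
    print(img.shape)                      # (64, n_frames) image fed to the CNN input layer
```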
New radar signals emerge constantly, with complex patterns and rich working modes. Radar waveforms with known modulation methods can be identified by correlating radar prior knowledge with the signals received by the reconnaissance receiver. For unknown radar signals, however, identifying the waveform under limited samples and low signal-to-noise ratio is a challenging problem. Exploiting the ability of a convolutional neural network (CNN) to learn deep image features, the reconstructed features of the time-frequency images (TFI) of known and unknown radar waveform signals are mined. This paper designs an unknown radar signal identification model based on transfer deep learning and linear-weight decision fusion. First, the CNN is trained using the known radar signals; then, based on transfer learning, the neurons of multiple underlying layers of the CNN are used to represent the reconstructed features; finally, the outputs of the individual random forest classifiers built on the original TFIs and on the short-time autocorrelation feature images (SAFI) are fused, and the identification decision for unknown signals is made by assigning linear weights to the two feature databases. The recognition rate for unknown new classes with small samples exceeds 80.31%, and the classification accuracy for known radar waveforms reaches more than 99.15%.

INDEX TERMS Unknown radar waveform recognition, convolutional neural network, decision fusion, transfer learning, random forest.
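The linear-weight decision fusion step can be sketched as follows: two random forests, one on features from the original TFIs and one on SAFI features, output class probabilities that are combined with a linear weight, and low-confidence samples are rejected as unknown. The feature shapes, the fusion weight w = 0.6, and the rejection threshold are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative feature matrices standing in for deep-CNN features extracted from
# the original TFIs and from the short-time autocorrelation feature images (SAFI).
rng = np.random.default_rng(0)
X_tfi_train, X_safi_train = rng.normal(size=(200, 128)), rng.normal(size=(200, 128))
y_train = rng.integers(0, 8, 200)               # 8 known waveform classes
X_tfi_test, X_safi_test = rng.normal(size=(20, 128)), rng.normal(size=(20, 128))

rf_tfi = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tfi_train, y_train)
rf_safi = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_safi_train, y_train)

w = 0.6                                         # linear fusion weight (illustrative value)
fused = w * rf_tfi.predict_proba(X_tfi_test) + (1 - w) * rf_safi.predict_proba(X_safi_test)

threshold = 0.5                                 # below this confidence, flag as an unknown class
pred = np.where(fused.max(axis=1) >= threshold, fused.argmax(axis=1), -1)
print(pred)                                     # -1 marks waveforms rejected as unknown
```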