2019
DOI: 10.1109/lwc.2018.2875001
Automatic Modulation Classification Using Cyclic Correntropy Spectrum in Impulsive Noise

Cited by 64 publications (21 citation statements)
References 14 publications
“…Nonetheless, this method required a complex computation process because the SVM classifier cannot handle large datasets. In another study, a cyclic correntropy spectrum-based MC approach was introduced by Ma et al. [28]. Cycle frequencies (CF) were utilized to classify the modulation type of the signal.…”
Section: Automatic Modulation Classification (AMC) Without Machine Learning
confidence: 99%
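The excerpt above describes classifying modulations from cycle frequencies of the cyclic correntropy spectrum. A minimal sketch of estimating a cyclic correntropy function is given below; the Gaussian kernel width `sigma`, the fixed lag `tau`, and the discrete cycle-frequency grid are illustrative assumptions, not the authors' implementation from [28]:

```python
import numpy as np

def correntropy_sequence(x, tau, sigma=1.0):
    """Instantaneous Gaussian-kernel correntropy between x[n] and x[n+tau]."""
    d = x[:-tau] - x[tau:]
    return np.exp(-np.abs(d) ** 2 / (2 * sigma ** 2))

def cyclic_correntropy(x, tau, sigma=1.0):
    """Cyclic correntropy at lag tau over cycle frequencies alpha = k / N.

    The Fourier transform over the time index n decomposes the
    correntropy sequence into its cyclic (periodic) components; peaks at
    nonzero alpha reveal the cycle frequencies used for classification.
    """
    v = correntropy_sequence(x, tau, sigma)
    return np.fft.fft(v) / len(v)
```

Because the kernel is bounded, the statistic is far less sensitive to impulsive noise outliers than a plain cyclic autocorrelation, which is the motivation quoted above.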
“…The algorithms proposed in this paper are suitable for both Gaussian and non-Gaussian noise. Whichever kind of noise the input signal is closer to, the proposed algorithm performs well and is highly tolerant of noise [13], [14].…”
Section: Introduction
confidence: 99%
“…A variety of features were extracted and employed in [6]-[18], including amplitude with phase and carrier frequency [6], instantaneous features [7], high-order statistical features [8], [9], cyclic spectrum parameters [10], [11], bispectrum features [12], wavelet features [13], [14], and constellation diagrams [15], [16]. As training classifiers, machine-learning methods such as the support vector machine (SVM) in [6], [13], [17], [18], decision tree in [7], [8], [14], k-nearest neighbor (KNN) in [10], compressive sensing in [12], genetic algorithm in [15], and neural network (NN) in [9], [11], [16] were widely used for their robustness, self-adaptation, and nonlinear processing ability [19]. However, the performance of these PR methods largely depends on empirical feature extraction due to the limited capacity of the classifiers [19].…”
Section: Introduction
confidence: 99%
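The excerpt above surveys feature-based AMC, including high-order statistical features. As a minimal illustration of such a feature (not any specific method from [6]-[18]), the normalized fourth-order cumulant C40 separates BPSK from QPSK even with ideal noiseless symbols; the symbol alphabets below are the standard unit-power constellations:

```python
import numpy as np

def c40(x):
    """Normalized fourth-order cumulant C40 = (E[x^4] - 3*E[x^2]^2) / E[|x|^2]^2
    of a zero-mean complex baseband signal."""
    c20 = np.mean(x ** 2)
    c21 = np.mean(np.abs(x) ** 2)
    return (np.mean(x ** 4) - 3 * c20 ** 2) / c21 ** 2

rng = np.random.default_rng(0)
bpsk = rng.choice([-1.0, 1.0], size=4096).astype(complex)
qpsk = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=4096) / np.sqrt(2)
# |C40| is about 2 for BPSK and about 1 for QPSK, so a simple
# threshold (or any classifier fed this feature) separates them.
```

In a full pipeline, such cumulant features would be stacked into a feature vector and passed to one of the classifiers listed in the excerpt (SVM, decision tree, KNN, NN).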