2023
DOI: 10.23919/cje.2021.00.347

Recursive Feature Elimination Based Feature Selection in Modulation Classification for MIMO Systems

Abstract: Feature-based (FB) algorithms are widely used in modulation classification due to their low complexity. As a prerequisite step of FB, feature selection can reduce the computational complexity without significant performance loss. In this paper, according to the linear separability of cumulant features, the hyperplane of the support vector machine is used to classify modulation types, and the contribution of different features is ranked through the weight vector. Then, cumulant features are selected using r…
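The ranking scheme the abstract describes — score each feature by the magnitude of the linear SVM's hyperplane weight vector, then eliminate the weakest recursively — can be sketched with scikit-learn's `RFE` wrapper. This is a minimal illustration, not the paper's implementation: the synthetic data stands in for the real higher-order cumulant features, and all sizes are arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

# Synthetic stand-in for cumulant features of the received signal;
# the paper's actual features are higher-order cumulants.
X, y = make_classification(n_samples=300, n_features=9,
                           n_informative=4, random_state=0)

# A linear SVM exposes the hyperplane weight vector w; RFE ranks
# features by |w| and drops the lowest-ranked one each round.
svm = LinearSVC(dual=False)
selector = RFE(svm, n_features_to_select=4, step=1)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the 4 retained features
print(selector.ranking_)   # 1 = selected; larger = eliminated earlier
```

Note that `RFE` retrains the base model once per elimination round, which is where the wrapper method's extra cost comes from relative to a one-shot filter ranking.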

Cited by 9 publications (8 citation statements)
References 26 publications
“…Liu J [31] proposed a weighted random forest algorithm, introducing weighting into the construction of the decision trees and using weighted voting in the decision process to improve predictive ability. A feature selection method represents the objects in the original dataset with a subset of features and removes redundant feature information [32]. Parlak B [33] extracted more representative and discriminative features, which effectively improved classification accuracy.…”
Section: Introduction
confidence: 99%
“…Although they are computationally expensive, wrapper methods such as recursive feature elimination often outperform filter methods. The downside is that they are prone to overfitting and, because of their computational intensity, are not well suited to high-dimensional datasets [8]–[10].…”
Section: In-depth Review of Existing Machine Learning Models Used For...
confidence: 99%
“…Recursive feature elimination (RFE) is a feature selection algorithm [6]. It uses a base model for multiple rounds of training; after each round, the weakest features are deleted until the required number of features remains. The base model we chose was SVR. The feature subsets filtered out through the recursive elimination method are shown in Table 1.…”
Section: Feature Selection Based on Recursive Elimination
confidence: 99%
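The variant this statement describes — RFE with support vector regression as the base model — requires a linear kernel, since RFE needs the estimator to expose a coefficient vector to rank features by. A hedged sketch with placeholder data (the cited work's dataset and feature counts are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.svm import SVR

# Placeholder regression data; sizes are illustrative only.
X, y = make_regression(n_samples=200, n_features=8,
                       n_informative=3, random_state=1)

# RFE requires the estimator to expose coef_ (or
# feature_importances_), so the SVR kernel must be linear.
base = SVR(kernel="linear")
rfe = RFE(base, n_features_to_select=3, step=1)
rfe.fit(X, y)

kept = np.flatnonzero(rfe.support_)  # indices of retained features
print(kept)
```

Each elimination round refits the SVR on the surviving features, so runtime grows with both the number of rounds and the cost of one SVR fit.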
“…The results of the XGBoost prediction model for octane number (RON) loss are shown in Figure 1. From Figure 1, it can be seen that the candidate parameters of the XGBoost prediction model for octane number (RON) loss were: number of base trees: [100, 200, 300, 400, 500, 600, 700]; maximum depth of the decision tree: [1, 2, 3, 4, 5, 6, 7];…”
Section: Establishment and Evaluation of Models
confidence: 99%
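The hyperparameter sweep this statement lists (tree count crossed with tree depth) is a standard grid search. A minimal sketch follows; note the assumptions: scikit-learn's GradientBoostingRegressor is swapped in for XGBoost to keep the example dependency-free, the data is synthetic, and the grid is trimmed so the sketch runs quickly.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the RON-loss dataset.
X, y = make_regression(n_samples=200, n_features=6, random_state=2)

# Grid mirrors the cited setup (number of base trees, maximum tree
# depth); values trimmed from the full [100..700] x [1..7] sweep.
param_grid = {"n_estimators": [100, 200, 300],
              "max_depth": [1, 2, 3]}

search = GridSearchCV(GradientBoostingRegressor(random_state=2),
                      param_grid, cv=3)
search.fit(X, y)

print(search.best_params_)  # best (n_estimators, max_depth) pair
```

With the real XGBoost library the same loop would use `xgboost.XGBRegressor` as the estimator; the grid-search mechanics are identical.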