IJPE 2018
DOI: 10.23940/ijpe.18.02.p9.280289
Relief Feature Selection and Parameter Optimization for Support Vector Machine based on Mixed Kernel Function

Abstract: In order to improve the classification performance of Support Vector Machine (SVM), Relief feature selection algorithm was used to obtain the most relevant feature subset and remove redundant features. The mixed kernel function, which combined the global kernel function with the local kernel function, was proposed to strengthen the learning ability and generalization performance of SVM. In addition, the parameter optimization of SVM, which combined Genetic Algorithm (GA) with grid search, was performed to redu…
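The abstract is truncated before it specifies how the global and local kernels are combined, so the following is only an illustrative sketch: a convex combination of an RBF kernel (local) and a polynomial kernel (global), plugged into scikit-learn's SVC as a callable kernel. The mixing weight LAMBDA, the RBF width GAMMA, and the polynomial degree are assumed values, not the paper's settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

# Hypothetical parameters -- the paper's actual values are not in the excerpt.
LAMBDA = 0.5     # mixing weight between local and global kernels
GAMMA = 0.5      # RBF (local) kernel width
DEGREE = 2       # polynomial (global) kernel degree

def mixed_kernel(X, Y):
    """Convex combination of a local (RBF) and a global (polynomial) kernel.

    A nonnegative weighted sum of positive semidefinite kernels is itself
    a valid (Mercer) kernel, so SVC can train on it directly.
    """
    return (LAMBDA * rbf_kernel(X, Y, gamma=GAMMA)
            + (1 - LAMBDA) * polynomial_kernel(X, Y, degree=DEGREE))

# Toy data standing in for a real feature-selected subset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
clf = SVC(kernel=mixed_kernel).fit(X, y)
print(clf.score(X, y))
```

Because SVC accepts any callable returning the Gram matrix between two sample sets, LAMBDA, GAMMA, and DEGREE could then be tuned jointly with C by the GA-plus-grid-search scheme the abstract mentions.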

Cited by 3 publications (2 citation statements)
References 24 publications
“…The results of MIFS and other 11 dimensionality reduction methods on these four data sets are compared. The 11 methods used for comparison are: the sequential forward selection algorithm (SFS), 49 the Spearman's rank correlation coefficient algorithm (SC2), 50 the sparse group Lasso algorithm (SGL), 51 the mutual information maximization algorithm (MIM), 49 the adaptive sparse group Lasso algorithm based on conditional mutual information (ASGL‐CMI), 52 the distributed ranking filter approach removing the features with information gain zero from the ranking and correlation‐based feature selection algorithm (DRFO‐CFS), 52 the neighborhood rough set‐based reduction algorithm (NRS), 32 locally linear embedding and neighborhood rough set (LLE‐NRS), 53 the Relief algorithm combined with the NRS algorithm (Relief‐NRS), 54 and the fuzzy backward feature elimination algorithm (FBFE). 55 Tables 12 and 13 show, respectively, the size and SVM accuracy of the reduction subset selected by the 12 methods on the four data sets, where the meaning of the symbol “/” is the same as that in Tables 9–11.…”
Section: Experiments and Analysis
confidence: 99%
“…To further verify the reduction performance and classification ability of the BONJE algorithm, this part of the experiment compares the BONJE algorithm with 10 other reduction algorithms from the perspective of the number of selected features and SVM classification accuracy on 3 representative tumor data sets (Colon, Leukemia, Lung). The ten different dimensionality reduction methods are: (1) the neighborhood rough set-based reduction algorithm (NRS) [35], (2) feature selection algorithm with Fisher linear discriminant (FLD-NRS) [32], (3) the gene selection algorithm based on locally linear embedding (LLE-NRS) [43], (4) the Relief algorithm [44] combined with the NRS algorithm (Relief + NRS) [35], (5) the fuzzy backward feature elimination algorithm (FBFE) [44], (6) the binary differential evolution algorithm (BDE) [2], (7) the sequential forward selection algorithm (SFS) [29], (8) the Spearman's rank correlation coefficient algorithm (SC2) [36], (9) the mutual information maximization algorithm (MIM) [2], (10) feature selection algorithm with the Fisher score based on decision neighborhood entropy (FSDNE) [18]. Tables 13 and 14 show the experimental results of 11 dimensionality reduction algorithms.…”
Section: Comparison of BONJE Algorithm and Multiple Dimensionality Reduction Algorithms
confidence: 99%
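Several of the compared pipelines above (Relief‐NRS, Relief + NRS) build on the Relief weighting scheme the cited paper starts from. A minimal two-class Relief sketch, assuming Manhattan distance and range normalization (neither is specified in the excerpt): features that separate a sample from its nearest miss more than from its nearest hit accumulate positive weight.

```python
import numpy as np

def relief_weights(X, y, n_iter=100, seed=0):
    """Basic two-class Relief: for each sampled instance, reward features
    on which the nearest miss (other class) differs more than the nearest
    hit (same class). Returns one weight per feature."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0                      # guard against constant features
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.abs(X - X[i]).sum(axis=1)    # Manhattan distance to all points
        dist[i] = np.inf                       # exclude the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / span
    return w / n_iter

# Toy data: feature 0 carries the class signal, feature 1 is pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
X = np.column_stack([y + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
w = relief_weights(X, y)
print(w)   # the informative feature should get the larger weight
```

A feature-selection step then keeps the top-ranked features by weight; hybrid methods like Relief‐NRS feed that ranked subset into a neighborhood-rough-set reduction rather than using the weights alone.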