2014 International Conference on IT Convergence and Security (ICITCS)
DOI: 10.1109/icitcs.2014.7021732
An Evaluation of Feature Selection Technique for Dendrite Cell Algorithm

Abstract: The dendrite cell algorithm needs appropriate features to represent its specific input signals. Although many feature selection algorithms have been used to identify appropriate features for dendrite cell signals, some algorithms have never been investigated, and there is limited work comparing their performance. In this study, six feature selection algorithms, namely Information Gain, Gain Ratio, Symmetrical Uncertainties, Chi Square, Support Vector Machine, and Rough Set with Genetic Algorithm Redu…
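To make the first of the listed techniques concrete, here is a minimal, self-contained sketch of Information Gain as a feature-ranking criterion: IG(Y; X) = H(Y) - H(Y | X), the reduction in label entropy after observing a discrete feature. The data and function names are illustrative, not from the paper.

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Shannon entropy H(Y) of a sequence of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X) for one discrete feature."""
    n = len(labels)
    groups = defaultdict(list)
    for x, y in zip(feature_values, labels):
        groups[x].append(y)
    # H(Y | X): entropy within each feature-value group, weighted by group size
    conditional = sum((len(ys) / n) * entropy(ys) for ys in groups.values())
    return entropy(labels) - conditional

# Toy example: a perfectly predictive feature vs. an uninformative one
labels = [0, 0, 1, 1]
print(information_gain([0, 0, 1, 1], labels))  # 1.0 bit (fully informative)
print(information_gain([0, 1, 0, 1], labels))  # 0.0 bits (uninformative)
```

Features would then be ranked by this score and the top-scoring ones mapped to the algorithm's input signals; the other five selectors in the study substitute different scoring or subset-search criteria for the same ranking step.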

Cited by 4 publications (2 citation statements)
References 9 publications
“…In this work, two experiments are performed to study the feasibility and superiority of the proposed approach. In the first experiment, GGA-DCA and state-of-the-art DCA expansion algorithms (NIDDCA [1], FLA-DCA [17], GA-PSM [29], and the SVM-DCA [20]) perform classification tasks on the 24 data sets. For the purpose of baseline comparison, the well-known classifiers, the KNN, the DT, the XGboost, the RF, and the ERT, also perform classification tasks on the 24 data sets.…”
Section: Experiments Setup
confidence: 99%
“…However, it is essential to realize that relevance/importance according to those definitions does not imply membership in the optimal feature subset, and irrelevance does not mean that a feature cannot be in the optimal feature subset [18]. In addition, machine learning algorithms, such as K-Nearest Neighbors (KNN) [19] and Support Vector Machine (SVM) [20], are employed for input signal generation. Moreover, Zhou et al [1] utilized numerical differentiation to extract features based on the data changes of the selected features.…”
Section: Introduction
confidence: 99%