2020
DOI: 10.1109/access.2020.3012838

A Hybrid Improved Dragonfly Algorithm for Feature Selection

Abstract: Feature selection, which eliminates irrelevant and redundant features, is one of the most effective ways to improve classification. However, searching for an optimal subset of the original feature set remains a challenging problem. This paper proposes a novel feature selection algorithm, the hybrid improved dragonfly algorithm (HIDA), which combines the advantages of mRMR and the improved dragonfly algorithm (IDA) to generate promising candidate subsets and achieve a higher classification accuracy rate. Firstly, to …
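The abstract describes a two-stage hybrid: an mRMR filter first ranks features by relevance to the class label while penalizing redundancy among already-chosen features, and a wrapper (the dragonfly-based search) then refines the candidate subset. A minimal sketch of the mRMR stage is shown below, assuming discrete features; the function names `mutual_info` and `mrmr_select` are illustrative, not from the paper.

```python
import numpy as np

def mutual_info(x, y):
    """Empirical mutual information (in nats) between two discrete vectors."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal of x
    py = joint.sum(axis=0, keepdims=True)  # marginal of y
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def mrmr_select(X, y, k):
    """Greedy mRMR: at each step pick the feature maximizing
    relevance I(f; y) minus mean redundancy I(f; s) over selected s."""
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, f], y) for f in range(n_features)]
    selected = [int(np.argmax(relevance))]  # start with the most relevant feature
    while len(selected) < k:
        best, best_score = None, -np.inf
        for f in range(n_features):
            if f in selected:
                continue
            redundancy = np.mean([mutual_info(X[:, f], X[:, s]) for s in selected])
            score = relevance[f] - redundancy
            if score > best_score:
                best, best_score = f, score
        selected.append(best)
    return selected
```

In a hybrid scheme like HIDA, the subset returned by such a filter would seed the metaheuristic search rather than serve as the final answer.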

Cited by 34 publications (24 citation statements). References 52 publications.
“…For the DLBCL dataset, the work mentioned in [14, 26, 30] attained an accuracy of 100%, but the number of features selected by the authors was greater than the number selected by our proposed method. They utilized 16, 6, and 8 features, respectively, whereas the proposed method selected only four features as the optimal feature set.…”
Section: Results
confidence: 86%
“…It can be seen from Table 6 that all the other works achieved lower accuracy while utilizing more features than our proposed method. The authors of [11, 14, 15] selected a very large number of features (236, 135, and 169 features, respectively) as an optimal feature set and achieved 82.96%, 74.77%, and 88.72% accuracy, respectively. The authors of [12] succeeded in selecting a comparatively smaller number of features (30), but the accuracy obtained was 85.58%.…”
Section: Results
confidence: 99%
“…The BDA algorithm has been paired with different learning algorithms; for example, Naik et al. [194] used BDA with a radial basis function neural network to select features from microarray gene data. Several other binary versions of DA have been proposed to solve the feature selection problem and can be found in [195]-[198].…”
Section: B. Swarm Intelligence Based Algorithms
confidence: 99%
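Binary versions of the dragonfly algorithm typically keep the continuous velocity/step update of the original DA and binarize it through a transfer function: each bit of the feature mask is flipped with a probability derived from the magnitude of the step. The sketch below illustrates that binarization step only, using a common V-shaped transfer function; the names `v_transfer` and `binarize_step` are illustrative and not taken from any of the cited works.

```python
import numpy as np

def v_transfer(delta):
    """V-shaped transfer function: maps a continuous step to a flip
    probability in [0, 1); larger |delta| means a more likely bit flip."""
    return np.abs(delta / np.sqrt(delta ** 2 + 1.0))

def binarize_step(position, delta, rng):
    """Flip each bit of a binary feature mask (1 = feature kept) with
    probability given by the transfer function applied to its step."""
    flip = rng.random(position.shape) < v_transfer(delta)
    return np.where(flip, 1 - position, position)
```

A wrapper built this way evaluates each candidate mask with a classifier (e.g. the RBF network mentioned above) and feeds the fitness back into the continuous DA update.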