2020
DOI: 10.1109/access.2020.3006473

Improved Harris Hawks Optimization Using Elite Opposition-Based Learning and Novel Search Mechanism for Feature Selection

Abstract: The rapid increase in data volume and feature dimensionality has a negative influence on machine learning and many other fields, such as decreasing classification accuracy and increasing computational cost. Feature selection plays a critical role as a preprocessing step in mitigating these issues. It works by eliminating features that may negatively influence classifier performance, such as irrelevant, redundant and less informative features. This paper aims to introduce an improved Harris ha…

Cited by 102 publications (51 citation statements)
References 46 publications
“…The importance of this work comes from the fact that the Harris Hawks Optimization algorithm has been applied in many fields, such as image processing [36], the Optimal Power Flow Problem [37], and drug design and discovery [38]. HHO was also applied to feature selection using the Elite Opposition-Based Learning method [39] and provided good results. The following section presents the basics and background of the HHO algorithm.…”
Section: Related Work
confidence: 99%
“…As mentioned earlier, hybridized algorithms are created by combining two optimization algorithms to overcome the demerits of one algorithm while taking advantage of the merits of the other. However, most hybridized algorithms still suffer from premature convergence, depending on the mechanisms of the original optimization algorithms [47]. The key point is to find a good match between the selected algorithms when creating the hybrid, in order to obtain a more efficient variant.…”
Section: Introduction
confidence: 99%
“…Generally speaking, there are two main types of feature selection methods: filter-based methods and wrapper-based methods. Chi-Square (Chi) and Information Gain (IG) are some examples of filter-based methods [2]. On the other hand, the wrapper-based methods mainly work based on using an optimisation algorithm to select the optimal subset of features.…”
Section: Introduction
confidence: 99%
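To make the filter-based methods named in the quotation concrete, the sketch below scores features with Chi-Square and Information Gain (mutual information) using scikit-learn. The dataset, the number of retained features, and the variable names are illustrative assumptions, not details taken from the cited papers.

```python
# Minimal sketch of filter-based feature selection (Chi-Square and Information Gain).
# The dataset and k=10 are placeholders chosen only for illustration.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)

# Chi-Square: scores each (non-negative) feature against the class labels,
# independently of any classifier.
chi_selector = SelectKBest(score_func=chi2, k=10)
X_chi = chi_selector.fit_transform(X, y)

# Information Gain (mutual information): measures the dependency between
# each feature and the label.
ig_selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_ig = ig_selector.fit_transform(X, y)

print("Chi-Square kept features:", chi_selector.get_support(indices=True))
print("Information Gain kept features:", ig_selector.get_support(indices=True))
```

Because filter methods score features without training the target classifier, they are fast but may miss feature interactions, which is the gap wrapper-based methods address.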
“…Therefore, in wrapper mode, there is a direct interaction between the features and the classifier used. In the wrapper-based method, a fitness function is used by the optimisation algorithm to evaluate the quality of the selected features, taking into account the achieved classification accuracy and the number of selected features [2].…”
Section: Introduction
confidence: 99%
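The quoted description corresponds to the fitness function commonly used in wrapper-based feature selection, which weighs classification error against the fraction of selected features. The sketch below is one minimal version of that idea; the weighting alpha = 0.99, the KNN classifier, and the cross-validation setup are assumptions, not necessarily the exact configuration of the cited paper.

```python
# Sketch of a wrapper-style fitness function for binary feature masks.
# alpha and the KNN classifier are assumed values for illustration only.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(binary_mask, X, y, alpha=0.99):
    """Lower is better: alpha * classification error + (1 - alpha) * feature ratio."""
    selected = np.flatnonzero(binary_mask)
    if selected.size == 0:               # penalise empty feature subsets
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    error = 1.0 - acc
    ratio = selected.size / X.shape[1]   # fraction of features kept
    return alpha * error + (1 - alpha) * ratio
```

A metaheuristic such as HHO would then search over binary masks to minimise this fitness, trading a small accuracy loss for a smaller feature subset.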