2021
DOI: 10.1109/access.2021.3056407

Metaheuristic Algorithms on Feature Selection: A Survey of One Decade of Research (2009-2019)

Abstract: Feature selection is a critical and prominent task in machine learning. The main aim of the feature selection problem is to reduce the dimension of the feature set while maintaining predictive accuracy. Various methods have been developed to classify datasets. However, metaheuristic algorithms have attracted great attention for solving numerous optimization problems. Therefore, this paper presents an extensive literature review on solving the feature selection problem using metaheuristic algorithms wh…
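As a concrete illustration of the dimension-versus-accuracy trade-off described in the abstract, the sketch below evaluates a candidate feature subset with a wrapper-style fitness that rewards cross-validated accuracy and penalizes subset size. The dataset, classifier, and the weight alpha are illustrative assumptions, not choices made by the survey.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def subset_fitness(mask, X, y, alpha=0.9):
    """Wrapper-style fitness: weight accuracy against the fraction of features kept.

    `mask` is a boolean vector over the columns of X; `alpha` (assumed value)
    trades classification accuracy off against subset size.
    """
    if not mask.any():                      # an empty subset cannot be evaluated
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
mask = rng.random(X.shape[1]) < 0.5         # a random candidate subset
print(f"{mask.sum()} features, fitness = {subset_fitness(mask, X, y):.3f}")
```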


Cited by 373 publications (236 citation statements)
References 264 publications (253 reference statements)
“…The above-mentioned algorithms are able to provide better solutions for most feature selection problems [33], [34]. However, these algorithms are unable to provide an optimal feature subset for high-dimensional datasets.…”
Section: Related Work (mentioning)
confidence: 99%
“…The dataset for solving Android malware classification in this paper is divided into two parts. One part is obtained by decompiling, with a total of 2720 samples: 532 malicious application samples from the Canadian Institute of Network Security [41], which contain typical Android platform malware such as ransomware applications, threatening SMS applications, and advertising applications, and 400 malicious samples from Dr. Wang's repository dataset (http://infosec.bjtu.edu.cn/wangwei/?page_id=85); benign apps were mainly downloaded through Google Play Python crawlers, obtaining a total of 188 APKs from the Xiaomi App Market that had been released to the App Market after security testing.…”

Algorithm listing interleaved in the excerpt (binary cuckoo search with chaotic initialization and LĂ©vy flights):
(3) for each nest in population do
(4)   for each dimension in one nest do
(5)     Randomly assign values to two populations of individuals based on [Lb, Ub]; update populations based on chaotic-map equations (7) and (8)
(6)   end
(7)   Convert two populations to binary using equations (3) and (4)
(8)   Calculate the fitness for individuals of two populations by equation (12)
(9)   Update the nest_fitness
(10) end
(11) Both populations perform Algorithm 3
(12) while t ≀ max_iterations or stop criteria do
(13)   for each nest in populations do
(14)     Perform LĂ©vy flights to generate new populations using equations (9) and (10)
(15)   end
(16)   for each nest in populations do
(17)     ⊳ Rectification procedure
(18)     Normalize the population value generated by the LĂ©vy flight
(19)     Binary conversion
(20)   end
(21)   Then execute Algorithm 2
(22) end
(23) if nest_fitness < mut_fitness then
(24)   replace the nest with mut_pop(i)
(25) 

Section: Datasets (mentioning)
confidence: 99%
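The listing above hinges on two mechanics that are easy to miss in the garbled excerpt: a LĂ©vy-flight step to propose new nests, and a transfer function mapping continuous positions to a binary feature mask. Below is a minimal sketch of both, assuming Mantegna's algorithm for the LĂ©vy step and a standard sigmoid transfer; the equation numbers cited in the excerpt refer to the citing paper, not to this sketch.

```python
import math
import numpy as np

def levy_step(size, beta=1.5, rng=None):
    """Draw a LĂ©vy-distributed step via Mantegna's algorithm (beta is the stability index)."""
    rng = rng or np.random.default_rng()
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def to_binary(position, rng=None):
    """Sigmoid transfer function: map a continuous nest position to a 0/1 feature mask."""
    rng = rng or np.random.default_rng()
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

rng = np.random.default_rng(42)
nest = rng.uniform(-1.0, 1.0, 10)                  # continuous position over 10 features
new_nest = nest + 0.01 * levy_step(10, rng=rng)    # LĂ©vy-flight move (step scale assumed)
print(to_binary(new_nest, rng=rng))                # candidate feature mask
```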
“…Traditional wrapper methods are based on improvements of machine learning classifiers, greedy search, and other methods [6], but these methods tend to fall into local optima and are computationally expensive. Nowadays, nature-inspired algorithms are widely used to solve optimization problems; they are increasingly used by scholars to find the optimal value of a problem due to their conceptual simplicity and ease of implementation [7]. A large number of contributions have been made using nature-inspired algorithms to solve feature selection problems.…”
Section: Introduction (mentioning)
confidence: 99%
“…In this sense, one can apply various heuristic search strategies such as hill climbing and best first [25] to search the feature subset space in a reasonable time. Metaheuristic algorithms such as Simulated Annealing (SA) [29], Genetic Algorithm (GA) [30], and Particle Swarm Optimization (PSO) [31] have also been applied efficiently as search-based feature selection approaches. Recently, researchers have explored strategies that design parallel algorithms to improve the running time of their feature selection approach, as proposed by Huang et al [32] for internet text classification.…”
Section: B. Feature Selection in Classification (mentioning)
confidence: 99%
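The heuristic search strategies named in this excerpt are straightforward to prototype. Below is a minimal sketch of hill climbing over binary feature masks with a bit-flip neighborhood; the toy objective and every name in it are illustrative assumptions, standing in for a wrapper-style fitness such as cross-validated classifier accuracy.

```python
import numpy as np

def hill_climb(fitness, n_features, iterations=200, seed=0):
    """First-improvement hill climbing over binary feature masks (bit-flip neighborhood)."""
    rng = np.random.default_rng(seed)
    mask = rng.random(n_features) < 0.5
    best = fitness(mask)
    for _ in range(iterations):
        candidate = mask.copy()
        candidate[rng.integers(n_features)] ^= True   # flip one feature in or out
        score = fitness(candidate)
        if score > best:                              # keep only improving moves
            mask, best = candidate, score
    return mask, best

# Toy objective standing in for a wrapper fitness: reward the 5 "relevant" features,
# penalize subset size.
relevant = np.zeros(20, dtype=bool)
relevant[:5] = True

def toy_fitness(mask):
    """Count of relevant features selected, minus a small penalty per selected feature."""
    return (mask & relevant).sum() - 0.1 * mask.sum()

mask, best = hill_climb(toy_fitness, n_features=20)
print(mask.astype(int), best)
```

Simulated annealing, GA, or PSO variants differ mainly in how candidate masks are generated and accepted, while the fitness evaluation stays the same.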