2020
DOI: 10.1016/j.swevo.2020.100663
A survey on swarm intelligence approaches to feature selection in data mining

Cited by 309 publications (109 citation statements)
References 101 publications
“…In the following, each upper-level solution (a feature subset represented by a binary vector indicating the selected features, where "1" means the corresponding feature is selected and "0" otherwise) is associated with an optimal lower-level solution (a tree encoding a constructed feature that represents optimal combinations of the feature subset). The two leader objectives can conflict, as minimizing the number of features may increase the classification error rate through the removal of relevant features [60]. A similar observation holds for the follower objectives, as maximizing relevance (correlation with the class labels) may increase the redundancy between the constructed features of the considered combination tree.…”
Section: Case Study: Multi-objective Bi-level Feature Construction
confidence: 84%
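The binary-vector encoding described in the excerpt above can be sketched in a few lines. This is a minimal illustration, not code from the cited work; `apply_mask` and the toy matrix are names and data invented for the example:

```python
import numpy as np

def apply_mask(X, mask):
    """Keep only the columns of X whose mask bit is 1 (feature selected)."""
    return X[:, np.asarray(mask, dtype=bool)]

# Toy data: 5 samples, 4 candidate features.
X = np.arange(20).reshape(5, 4)
mask = [1, 0, 1, 0]          # upper-level solution: features 0 and 2 selected
X_sel = apply_mask(X, mask)  # X_sel keeps 2 of the 4 columns
```

Minimizing the number of 1-bits in `mask` while keeping the classification error low is precisely the trade-off the excerpt calls conflicting.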
“…As an emerging computing technology, swarm intelligence algorithms have attracted much attention due to their simplicity and global search capability [22]. Shao et al. proposed a method that uses Particle Swarm Optimization (PSO) to optimize the number of hidden-layer neurons, the learning rate, and the momentum of a DBN; applied to experimental signals from rolling bearings, PSO-DBN recognized fault states more intuitively and accurately [23].…”
Section: On This Basis Azami Et Al Proposed Refined Composite
confidence: 99%
“…The population-based metaheuristic optimization algorithm (Nguyen et al., 2020) known as Particle Swarm Optimization (PSO) was proposed by Kennedy and Eberhart (1995). PSO simulates the movement of birds randomly searching for food in a search space.…”
Section: B Optimization Algorithm
confidence: 99%
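The bird-flock search described in the excerpt above can be illustrated with a minimal continuous PSO. This is a sketch under assumed settings — the inertia/acceleration parameters, the search bounds, and the sphere objective are illustrative choices, not taken from the cited works:

```python
import numpy as np

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO: each particle moves toward its own best position
    (pbest) and the swarm's best position (g)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()            # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g

# Minimizing the sphere function: the swarm converges near the origin.
best = pso(lambda p: float(np.sum(p**2)), dim=3)
```

For feature selection, as in the surveyed work, positions are typically binarized (e.g. by a sigmoid transfer function) rather than used as raw continuous vectors.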
“…Xin-She Yang (2010) presented the Bat Algorithm (BA) (Nguyen et al., 2020). BA was developed to mimic the echolocation behavior of bats, which use echoes to locate prey.…”
Section: B Optimization Algorithm
confidence: 99%
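The echolocation metaphor in the excerpt above can likewise be sketched as code. This is a simplified illustration, not Yang's full algorithm: the greedy acceptance rule and the fixed-scale local walk stand in for the loudness and pulse-rate schedules, and all parameter values are assumptions:

```python
import numpy as np

def bat_algorithm(obj, dim, n_bats=20, iters=100, fmin=0.0, fmax=2.0, seed=0):
    """Compact Bat Algorithm sketch: each bat's pulse frequency steers its
    velocity relative to the current best solution."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_bats, dim))  # bat positions
    v = np.zeros_like(x)                       # bat velocities
    fit = np.apply_along_axis(obj, 1, x)
    best = x[fit.argmin()].copy()
    for _ in range(iters):
        freq = fmin + (fmax - fmin) * rng.random((n_bats, 1))
        v = v + (x - best) * freq
        cand = x + v
        # Local random walk around the current best, standing in for the
        # pulse-rate-controlled local search of the full algorithm.
        local = rng.random(n_bats) < 0.5
        cand[local] = best + 0.1 * rng.normal(size=(int(local.sum()), dim))
        cand_fit = np.apply_along_axis(obj, 1, cand)
        better = cand_fit < fit                # keep only improving moves
        x[better], fit[better] = cand[better], cand_fit[better]
        best = x[fit.argmin()].copy()
    return best

best = bat_algorithm(lambda p: float(np.sum(p**2)), dim=3)
```

The frequency term plays the role of the varying pulse rates bats emit while homing in on prey; the local walk refines solutions near the current best.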