2021
DOI: 10.1016/j.patcog.2020.107804
Feature selection using bare-bones particle swarm optimization with mutual information

Cited by 181 publications (69 citation statements)
References 47 publications
“…The selection of important and suitable identification features not only simplifies computation but also aids understanding of causal relationships, which is a critical part of machine learning. The advantages of feature selection [39][40][41] include (1) reduced data collection cost, (2) improved data processing through the removal of redundant data and the simplification of patterns, resulting in faster computation, and (3) improved data interpretation, since feature selection improves prediction results and accelerates model derivation and knowledge discovery. The objective is to find the most relevant classification features, reduce the dimensionality, and correct the training samples so as to select important and effective conditional attributes.…”
Section: Feature Selection Methodsmentioning
confidence: 99%
“…Song et al [33] introduced an FS approach based on a new variant of the PSO algorithm, called bare-bones PSO. The main idea is to use a swarm initialization technique based on label correlation.…”
Section: Related Workmentioning
confidence: 99%
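The velocity-free update that gives bare-bones PSO its name can be sketched as follows. This is a hedged illustration of the classic bare-bones variant the cited work builds on, not the paper's exact algorithm: each coordinate of a particle's new position is drawn from a Gaussian centred between the personal best and the global best, with a spread equal to their distance. The function name and array shapes are illustrative assumptions.

```python
import numpy as np

def bbpso_update(personal_best, global_best, rng=None):
    """Resample a particle's position directly from a Gaussian; no
    velocity, inertia weight, or acceleration coefficients are needed."""
    rng = np.random.default_rng() if rng is None else rng
    mean = (personal_best + global_best) / 2.0      # centre between the two bests
    std = np.abs(personal_best - global_best)       # spread = their distance
    return rng.normal(mean, std)

pb = np.array([0.2, 0.8, 0.5])  # personal best (toy values)
gb = np.array([0.4, 0.6, 0.5])  # global best (toy values)
new_pos = bbpso_update(pb, gb, rng=np.random.default_rng(0))
```

Note that in a dimension where the personal and global bests agree (here the third coordinate), the spread is zero and the particle collapses onto that shared value, which is what drives convergence.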
“…All "birds" in the "flock" continuously update their positions to converge on the best position for the optimum solution [39], [40]. Recent research has applied PSO to feature selection in classification, for example using multi-objective PSO for cost-based feature selection problems [38], improving multi-objective PSO for a multi-label feature selection algorithm [41], studying a filter-based bare-bones PSO algorithm for unsupervised feature selection [42], and applying bare-bones PSO with mutual information for feature selection [43]. PSO has been shown to be an effective feature selection method that significantly enhances classification performance [44], [45].…”
Section: A Related Workmentioning
confidence: 99%
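The combination of a swarm-searched feature mask with a mutual-information objective, as in the cited approach, can be illustrated with a minimal fitness function. This is a sketch of an mRMR-style relevance-minus-redundancy criterion for a binary feature mask; the paper's exact objective, weighting `alpha`, and function names are assumptions made here for illustration.

```python
import numpy as np

def mutual_information(x, y):
    """MI (in nats) between two discrete 1-D arrays, from the joint histogram."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xv), len(yv)))
    np.add.at(joint, (xi, yi), 1)                 # joint counts
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def fitness(mask, X, y, alpha=0.9):
    """Score a binary feature mask: mean feature-label MI (relevance)
    minus mean feature-feature MI (redundancy). Illustrative only."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return -np.inf                             # selecting nothing is invalid
    relevance = np.mean([mutual_information(X[:, j], y) for j in idx])
    redundancy = 0.0
    if idx.size > 1:
        pairs = [(i, j) for i in idx for j in idx if i < j]
        redundancy = np.mean([mutual_information(X[:, i], X[:, j]) for i, j in pairs])
    return alpha * relevance - (1 - alpha) * redundancy

# Toy data: feature 0 determines the label, feature 1 is constant (uninformative).
X = np.array([[0, 1], [0, 1], [1, 1], [1, 1]])
y = np.array([0, 0, 1, 1])
informative = fitness(np.array([1, 0]), X, y)
uninformative = fitness(np.array([0, 1]), X, y)
```

In a full wrapper method, each particle's continuous position would be thresholded into such a mask and this score would serve as the fitness guiding the swarm.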