2018
DOI: 10.2991/ijcis.11.1.1

Bare bones particle swarm optimization with adaptive chaotic jump for feature selection in classification

Abstract: Feature selection (FS) is a crucial data pre-processing step in classification problems. It aims to reduce the dimensionality of the problem by eliminating irrelevant or redundant features while achieving similar or even higher classification accuracy than using all the features. As a variant of particle swarm optimization (PSO), bare bones particle swarm optimization (BBPSO) is a simple but very powerful optimizer. However, it also suffers from premature convergence like other PSO algorithms, especially in h…
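To make the setup in the abstract concrete, the sketch below shows a generic binary BBPSO loop for feature selection: each particle's position is resampled from a Gaussian centered between its personal best and the global best, thresholded into a feature mask, and scored by cross-validated k-NN accuracy. The dataset, classifier, threshold, and parameter values are illustrative assumptions, not the paper's configuration, and the adaptive chaotic jump is omitted here.

```python
# Minimal sketch of binary bare bones PSO (BBPSO) for feature selection.
# Fitness, thresholding, and parameters are standard conventions used only
# for illustration; this is not the authors' exact method.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features, n_particles, n_iters = X.shape[1], 20, 30

def fitness(mask):
    # Classification accuracy of a 5-NN classifier on the selected features.
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3).mean()

# Real-valued positions in [0, 1]; a feature is selected when its value > 0.5.
pos = rng.random((n_particles, n_features))
pbest = pos.copy()
pbest_fit = np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    for i in range(n_particles):
        # BBPSO update: sample each dimension from a Gaussian whose mean is
        # the midpoint of pbest and gbest and whose std is their distance.
        mu = (pbest[i] + gbest) / 2.0
        sigma = np.abs(pbest[i] - gbest)
        pos[i] = np.clip(rng.normal(mu, sigma + 1e-12), 0.0, 1.0)
        f = fitness(pos[i] > 0.5)
        if f > pbest_fit[i]:
            pbest[i], pbest_fit[i] = pos[i].copy(), f
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", int((gbest > 0.5).sum()), "accuracy:", pbest_fit.max())
```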

Cited by 17 publications (8 citation statements)
References 40 publications
“…Binary variants of the grasshopper optimization algorithm (GOA) were developed to solve the feature selection task in [20]. Arora et al. [5] selected an optimal feature subset, and in [32] a novel chaotic jump operator was developed to enrich the search behavior and add more randomness to the search process. Gu et al. [11] introduced a new variant of PSO, called competitive swarm optimization (CSO), to solve the high-dimensional feature selection problem.…”
Section: Related Work
Mentioning confidence: 99%
“…PSO, which was proposed by Eberhart and Kennedy in 1995 [41], is a simple and very powerful optimizer [31]. Recently, PSO and its variants have been widely employed to solve feature selection problems, and many PSO-based approaches have shown promising results [42,43,44,3,45]. Thus, within the framework of PSO, a new self-adaptive parameter and strategy based particle swarm optimization (SPS-PSO) algorithm is proposed in this research.…”
Section: Introduction
Mentioning confidence: 99%
“…SD and Euclidean distance measures are used to identify the significant features, and then a combination of PSO and a deep belief network is used to achieve better classification results. Qiu [21] implemented bare bones particle swarm optimization (BBPSO) with an adaptive chaotic jump (ACJ) strategy to select globally optimal features for classification. This version of PSO greatly reduces the occurrence of premature convergence.…”
Section: Introduction
Mentioning confidence: 99%
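The citation statement above describes the ACJ strategy only at a high level. The following is one plausible reading of a chaotic-jump perturbation, sketched under explicit assumptions: the logistic map, the stagnation counter, and the jump scale are placeholders for illustration, not the paper's exact formulation.

```python
# Hedged sketch of an adaptive chaotic jump: a stagnating particle is
# perturbed with a value drawn from a chaotic (logistic-map) sequence, and
# the probability of jumping grows with the length of the stagnation.
import numpy as np

def logistic_map(z):
    # Standard logistic map with r = 4, producing a chaotic sequence in (0, 1).
    return 4.0 * z * (1.0 - z)

def adaptive_chaotic_jump(position, stagnation_count, z, max_stagnation=5):
    """Perturb a stagnating particle with a chaotic jump (illustrative only)."""
    z = logistic_map(z)                          # advance the chaotic sequence
    jump_prob = min(1.0, stagnation_count / max_stagnation)
    if np.random.random() < jump_prob:
        # Map the chaotic value from (0, 1) to (-1, 1) and apply a small jump.
        position = np.clip(position + 0.1 * (2.0 * z - 1.0), 0.0, 1.0)
    return position, z
```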