Seventh International Conference on Intelligent Systems Design and Applications (ISDA 2007) 2007
DOI: 10.1109/isda.2007.101

Two-Step Particle Swarm Optimization to Solve the Feature Selection Problem

Abstract: In this paper we propose a new model of Particle Swarm Optimization called Two-Step PSO. The basic idea is to split the heuristic search performed by the particles into two stages. We have studied the performance of this new algorithm on the Feature Selection problem by using the reduct concept of Rough Set Theory. Experimental results show that the Two-Step approach improves over the standard PSO model in calculating reducts, at the same computational cost.
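As a rough illustration of the two-stage idea described in the abstract, the sketch below stages the search: stage one optimizes over a reduced feature space to build a partial solution, and stage two seeds a full-length search with it. All names are hypothetical, and a greedy bit-flip local search stands in for the PSO runs; this is not the paper's actual procedure.

```python
import random

def two_step_search(fitness, n_features, iters=20, seed=0):
    """Hypothetical sketch of a two-stage search over binary feature masks."""
    rng = random.Random(seed)

    def random_mask(n):
        return [rng.randint(0, 1) for _ in range(n)]

    def local_search(candidate, n, steps):
        # Placeholder for a PSO run: greedy single-bit-flip hill climbing.
        best, best_f = candidate[:], fitness(candidate)
        for _ in range(steps):
            trial = best[:]
            j = rng.randrange(n)
            trial[j] = 1 - trial[j]
            f = fitness(trial)
            if f > best_f:
                best, best_f = trial, f
        return best

    half = n_features // 2
    # Stage 1: search a reduced space to obtain a partial solution.
    partial = local_search(random_mask(half), half, iters)
    # Stage 2: use the partial solution as the initial state of a full search.
    seeded = partial + random_mask(n_features - half)
    return local_search(seeded, n_features, iters)
```

Under this sketch, any subset-quality measure (e.g. a rough-set-based fitness) can be plugged in as `fitness`.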

Cited by 64 publications (38 citation statements)
References 22 publications
“…Wei et al [26] proposed a mutation-enhanced BPSO algorithm that adjusts the memory of the local and global optimum (LGO) and increases the particles' mutation probability for feature selection, to overcome the premature convergence problem and obtain high-quality features. Other research on feature subset selection using PSO can be found in [27][28][29].…”
Section: Literature Review
confidence: 99%
“…However, standard implementations for feature selection fail, or their performance is severely affected, when selecting small feature subsets from large datasets (see section 3.1 for some grounds). For these reasons, some authors have adapted the standard binary PSO (sbPSO) presented by (Eberhart et al, 2001), implementing ad-hoc algorithms to overcome the dataset-size problem (Wang et al, 2007; Chuang et al, 2008; Xu et al, 2007; Takahashi Monteiro and Yukio, 2007; Bello et al, 2007; Alba et al, 2007). Nevertheless, these implementations mainly focus on maximizing the evaluation accuracy, and little or no consideration is given to minimizing the number of features selected (for an exception see (Alba et al, 2007)).…”
Section: Theory and New Applications of Swarm Intelligence
confidence: 99%
“…Some authors have proposed modifications to the sbPSO algorithm in order to improve performance in terms of search and evaluation capabilities (Chuang et al, 2008; Xu et al, 2007; Wang et al, 2007; Takahashi Monteiro and Yukio, 2007; Bello et al, 2007). However, these implementations mainly focus on maximizing the evaluation accuracy.…”
Section: Other Implementations for Feature Selection Using PSO
confidence: 99%
“…The movement of the particle is realized by flipping the bit value, and the velocity is no longer a rate of change of the position but the probability of changing it. We propose expression (12) in [4] to calculate the j-th dimension of the i-th particle. This is based on the position and velocity update equations of the particle as shown in [13] and [30].…”
Section: Particle Swarm Optimization
confidence: 99%
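The quoted statement above describes the standard binary-PSO convention: velocities follow the usual continuous update, but a sigmoid maps each velocity to the probability that the corresponding bit is set. A minimal sketch of that update (function names and the inertia/acceleration constants are illustrative choices, not the paper's expression (12)):

```python
import math
import random

def sigmoid(v):
    # Squash a velocity into (0, 1) so it can act as a bit-set probability.
    return 1.0 / (1.0 + math.exp(-v))

def update_particle(position, velocity, pbest, gbest,
                    w=0.72, c1=1.49, c2=1.49, seed=None):
    """One binary-PSO step: continuous velocity update, then each bit
    becomes 1 with probability sigmoid(velocity)."""
    rng = random.Random(seed)
    new_v, new_x = [], []
    for j in range(len(position)):
        r1, r2 = rng.random(), rng.random()
        vj = (w * velocity[j]
              + c1 * r1 * (pbest[j] - position[j])
              + c2 * r2 * (gbest[j] - position[j]))
        new_v.append(vj)
        new_x.append(1 if rng.random() < sigmoid(vj) else 0)
    return new_x, new_v
```

Note the design point the citation highlights: the position update is a stochastic bit flip, so the velocity is interpreted as a change probability rather than a displacement.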
“…Some of them have exhibited good results, mainly attained by using ACO- or PSO-based approaches, such as [11,1,2,24,25,26]. In [3] and [4], a new approach to feature selection based on the ACO and PSO methodologies is presented. The chief idea is to split the search process performed by the agents (ants or particles) into two stages, so that in the first stage an agent is directed to find a partial solution to the problem, which is then used as an initial state in the second stage.…”
Section: Introduction
confidence: 99%