2010
DOI: 10.1016/j.eswa.2010.03.016
Feature selection with Intelligent Dynamic Swarm and Rough Set

Cited by 89 publications (43 citation statements)
References 11 publications
“…This hybrid algorithm uses the ability of SA to avoid being trapped in local minima and the high convergence rate of the GA crossover operator. The PSO algorithm is also attractive for feature selection in that particles discover good feature combinations as they fly within the subset space. Along these lines, Bae et al. [87] proposed a modified PSO called the intelligent dynamic swarm (IDS). Niu et al. [88] introduced a hybrid optimization algorithm called AFSA-TSGM.…”
Section: Related Work (mentioning)
confidence: 99%
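The statement above describes the general idea of binary PSO for feature selection rather than the cited IDS variant itself. As a rough illustration only (not the method from the paper), the sketch below encodes each particle as a bit mask over features and uses a sigmoid transfer function to rebinarize positions; the `evaluate_subset` fitness is a hypothetical placeholder that would normally be a classification-accuracy or rough-set dependency score.

```python
import numpy as np

rng = np.random.default_rng(0)

def evaluate_subset(mask, X, y):
    """Hypothetical placeholder fitness. A real implementation would score
    the selected features with classification accuracy or a rough-set
    dependency measure; here we only reward smaller subsets."""
    if mask.sum() == 0:
        return 0.0
    return 1.0 - mask.sum() / mask.size

def binary_pso_feature_selection(X, y, n_particles=20, n_iter=50,
                                 w=0.7, c1=1.5, c2=1.5):
    n_features = X.shape[1]
    pos = rng.integers(0, 2, size=(n_particles, n_features))   # bit = feature kept
    vel = rng.uniform(-1.0, 1.0, size=(n_particles, n_features))
    pbest = pos.copy()
    pbest_fit = np.array([evaluate_subset(p, X, y) for p in pos])
    g = pbest_fit.argmax()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n_features))
        # Standard velocity update pulled toward personal and global bests
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))                       # sigmoid transfer
        pos = (rng.random((n_particles, n_features)) < prob).astype(int)
        fit = np.array([evaluate_subset(p, X, y) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if pbest_fit.max() > gbest_fit:
            g = pbest_fit.argmax()
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest   # bit mask of the best feature subset found
```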
[Table excerpt from the citing paper: reduct sizes and their occurrence counts for the CREDIT (20 attributes), DERM2 (34 attributes), and WQ (38 attributes) data sets.]
Section: RSFS Results (unclassified)
“…Obviously, this is an exhaustive search, and is impractical for large data sets [70]. To manage the complexity of the search process, several stochastic optimization methods, such as hill climbing with forward selection and backward elimination [62] and meta-heuristic methods such as the genetic algorithm [62], particle swarm optimization [7,65], ant colony optimization [22,71], as well as great deluge and non-linear great deluge [1,34] can be used. In this study, we have used MRMC-IWD and hybrid MRMC-IWD to tackle this challenging problem.…”
Section: The Rough Set Feature Subset Selection (RSFS) Problem (mentioning)
confidence: 99%
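For context, the "hill climbing with forward selection" mentioned in the statement above corresponds, in the rough-set setting, to greedily adding the attribute that most increases the dependency degree until it matches that of the full attribute set (a QuickReduct-style search). The sketch below is an assumed illustration of that procedure for a small categorical decision table, not code from any of the cited works.

```python
def dependency(X, y, feats):
    """Rough-set dependency degree: fraction of objects whose equivalence
    class under `feats` is consistent with the decision attribute y."""
    if not feats:
        return 0.0
    classes = {}
    for i, row in enumerate(X):
        classes.setdefault(tuple(row[f] for f in feats), []).append(i)
    consistent = sum(len(idx) for idx in classes.values()
                     if len({y[i] for i in idx}) == 1)
    return consistent / len(y)

def forward_selection_reduct(X, y):
    """Greedy forward selection: repeatedly add the attribute that most
    increases the dependency degree until it equals that of all attributes."""
    all_feats = list(range(len(X[0])))
    target = dependency(X, y, all_feats)
    selected = []
    while dependency(X, y, selected) < target:
        best = max((f for f in all_feats if f not in selected),
                   key=lambda f: dependency(X, y, selected + [f]))
        selected.append(best)
    return selected

# Tiny categorical example:
# X = [["a", 0], ["a", 1], ["b", 1]]; y = [0, 1, 1]
# forward_selection_reduct(X, y) -> [1]  (the second attribute alone
# already separates the decision classes)
```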
“…Furthermore, an information system can have more than one reduced attribute set (reduct). The set obtained from the intersection of all reducts derived from an information system is called the core of the attribute set A [21,22]. The core attribute set can also be derived from the discernibility matrix.…”
Section: Attribute Reduction and Core Attributes (mentioning)
confidence: 99%
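The remark that the core can be derived from the discernibility matrix follows from the fact that an attribute belongs to the core exactly when it is the only attribute discerning some pair of objects with different decision values, i.e. when it appears as a singleton matrix entry. A minimal sketch of that check, assuming a small categorical decision table:

```python
def discernibility_core(X, y):
    """Core via the discernibility matrix: an attribute is in the core iff
    it is the only attribute that discerns some pair of objects with
    different decision values (a singleton matrix entry)."""
    n, m = len(X), len(X[0])
    core = set()
    for i in range(n):
        for j in range(i + 1, n):
            if y[i] == y[j]:
                continue                      # only decision-discernible pairs matter
            differing = [a for a in range(m) if X[i][a] != X[j][a]]
            if len(differing) == 1:           # singleton entry -> core attribute
                core.add(differing[0])
    return core
```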