2019
DOI: 10.1016/j.ins.2019.07.026
A robust swarm intelligence-based feature selection model for neuro-fuzzy recognition of mild cognitive impairment from resting-state fMRI

Cited by 42 publications (12 citation statements)
References 32 publications
“…One of the reasons the system's overall performance was degraded, reaching only an average accuracy of 93.20%, is its low convergence rate. Anter et al. [14] chose a chaotic binary GWO technique to eliminate irrelevant features, aiming at dimensionality reduction without loss of classification accuracy. Similarly, Abdel-Basset et al. [15] proposed a GWO algorithm combined with a two-phase mutation strategy for wrapper-based feature selection in classification tasks; the two-phase mutation improves the algorithm's exploitation capability.…”
Section: Related Work
confidence: 99%
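The statement above refers to binary grey wolf optimization (GWO) used as a wrapper feature selector. A minimal sketch of that idea is shown below; it is not the cited authors' implementation, and the sigmoid transfer function, coefficient schedule, and `fitness` signature are illustrative assumptions.

```python
import math
import random

def binary_gwo(fitness, n_features, n_wolves=10, n_iter=50, seed=0):
    """Minimal binary GWO for feature selection (illustrative sketch).

    fitness(mask) -> float to MAXIMIZE; mask is a list of 0/1 flags,
    one per feature.
    """
    rng = random.Random(seed)
    wolves = [[rng.randint(0, 1) for _ in range(n_features)]
              for _ in range(n_wolves)]

    for t in range(n_iter):
        a = 2 - 2 * t / n_iter  # coefficient decreases linearly 2 -> 0
        # copy the three best wolves (alpha, beta, delta) as leaders
        leaders = [list(w) for w in
                   sorted(wolves, key=fitness, reverse=True)[:3]]
        for w in wolves:
            for j in range(n_features):
                step = 0.0
                for leader in leaders:
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    D = abs(C * leader[j] - w[j])
                    step += leader[j] - A * D
                x = step / 3.0
                # sigmoid transfer maps the continuous position to a bit
                prob = 1.0 / (1.0 + math.exp(-10 * (x - 0.5)))
                w[j] = 1 if rng.random() < prob else 0
    return max(wolves, key=fitness)
```

A typical wrapper fitness would combine classifier accuracy on the selected subset with a penalty on subset size, so the search is pushed toward small, discriminative feature sets.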
“…Naive Bayes classifiers have been widely used in many applications [1][2][3][4][5][6][7][8][9][10][11]. To improve the classification performance, many advanced NB classifiers have been developed, which can be divided into five categories.…”
Section: Naive Bayes Classifiers
confidence: 99%
“…Naive Bayes (NB) has been widely used in many machine-learning tasks because of its simplicity and efficiency [1][2][3][4][5][6][7][8][9][10][11]. It handles different data types well, such as numerical and categorical ones.…”
Section: Introduction
confidence: 99%
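For numerical features, the usual NB variant models each feature per class with a Gaussian. A minimal sketch, not tied to any cited paper, with the class name and variance-smoothing constant chosen for illustration:

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes: per-class feature means/variances."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for xi, yi in zip(X, y):
            groups[yi].append(xi)
        self.stats, self.priors = {}, {}
        for c, rows in groups.items():
            cols = list(zip(*rows))
            means = [sum(col) / len(col) for col in cols]
            # small constant avoids zero variance on constant features
            vars_ = [sum((v - m) ** 2 for v in col) / len(col) + 1e-9
                     for col, m in zip(cols, means)]
            self.stats[c] = (means, vars_)
            self.priors[c] = len(rows) / len(y)
        return self

    def predict(self, x):
        best, best_lp = None, -math.inf
        for c, (means, vars_) in self.stats.items():
            # log prior + sum of per-feature Gaussian log-likelihoods
            lp = math.log(self.priors[c])
            for v, m, s2 in zip(x, means, vars_):
                lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

The "naive" conditional-independence assumption is what lets the joint likelihood factor into that per-feature sum of log terms.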
“…In this approach, the Gaussian mutation operator and a chaotic local search method are introduced to enhance the performance of the original algorithm. Anter et al. [36] use a chaotic binary grey wolf optimization (GWO) approach as the feature selection model, attempting to reduce the number of features without losing information significant for classification. Abdel-Basset et al. [37] propose a GWO algorithm integrated with a two-phase mutation strategy to solve wrapper-based feature selection for classification problems; the two-phase mutation enhances the exploitation capability of the algorithm.…”
Section: Related Work
confidence: 99%
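The two-phase mutation mentioned above can be illustrated on a binary feature mask: one phase tries dropping selected features, the other tries adding unselected ones, keeping only non-worsening flips. This is a hedged sketch of the general idea, not the exact operator from [37]; the mutation probability `p` and acceptance rule are assumptions.

```python
import random

def two_phase_mutation(mask, fitness, p=0.3, seed=0):
    """Illustrative two-phase mutation on a binary feature mask.

    Phase 1 tries flipping selected bits to 0 (shrink the subset);
    phase 2 tries flipping unselected bits to 1 (recover useful
    features). A flip is kept only if fitness does not decrease.
    fitness(mask) -> float to MAXIMIZE.
    """
    rng = random.Random(seed)
    best = list(mask)
    best_fit = fitness(best)
    # Phase 1: attempt removals
    for j, bit in enumerate(list(best)):
        if bit == 1 and rng.random() < p:
            trial = list(best)
            trial[j] = 0
            f = fitness(trial)
            if f >= best_fit:
                best, best_fit = trial, f
    # Phase 2: attempt additions
    for j, bit in enumerate(list(best)):
        if bit == 0 and rng.random() < p:
            trial = list(best)
            trial[j] = 1
            f = fitness(trial)
            if f > best_fit:
                best, best_fit = trial, f
    return best
```

Because flips are accepted only when fitness does not drop, the operator can only improve (or preserve) the current solution, which is the exploitation-enhancing behavior the citing text describes.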