2023
DOI: 10.1016/j.compbiomed.2023.107025
NSICA: Multi-objective imperialist competitive algorithm for feature selection in arrhythmia diagnosis

Cited by 12 publications
(5 citation statements)
References 23 publications
“…This analysis revealed that RF achieved the highest accuracy (95.24%), precision (100%), sensitivity (89.47%), and specificity (100%) among all models. Furthermore, the proposed Random Forest model outperformed existing models referenced in the literature [8][9][10][11][12][13][14][15] across all performance metrics. To achieve this superior performance, the authors employed standard data preprocessing and feature selection methods.…”
Section: Discussion
confidence: 84%
“…The proposed approach attained a remarkable accuracy of 95.24%, surpassing the existing models with noticeable improvements (i.e., 2.05%, 9.26%, 7.97%, 46.43%, 20.56%, 0.85%, 1.86%, and 3.44%) over the referenced research works. Furthermore, the proposed approach exhibited superior precision (100%), sensitivity (89.47%), and specificity (100%) compared to [8][9][10][11][12][13][14][15]. The comparison of the proposed approach with existing methods is summarised in figure 8.…”
Section: Benchmarking
confidence: 99%
“…An efficient feature selection algorithm can improve the efficiency and performance of downstream training models. The feature selection method ranks features through algorithms, takes the upstream feature ranking results as input, and generates candidate feature subsets through subset search methods such as forward search [12][13][14][15][16]. This process suffers from excessive dependence on the feature ranking results and from the high dimensionality caused by mechanically generating feature subsets [17].…”
Section: Introduction
confidence: 99%
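The forward search described in the last citation statement can be sketched as a greedy loop that repeatedly adds whichever remaining feature most improves a subset score. This is a minimal illustration, not the cited authors' implementation; the scoring function below is a hypothetical stand-in for what would, in practice, be the validation accuracy of a downstream model such as Random Forest.

```python
def forward_search(features, score, k):
    """Greedy forward selection: add the feature that most improves
    `score(subset)` until k features are chosen or no candidate helps."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        # Pick the candidate whose addition yields the best subset score.
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break  # no remaining feature improves the subset; stop early
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy scoring function (hypothetical): reward covering features {1, 3},
# with a small penalty per feature to discourage large subsets.
target = {1, 3}
toy_score = lambda subset: len(target & set(subset)) - 0.1 * len(subset)
print(forward_search([0, 1, 2, 3, 4], toy_score, k=3))  # → [1, 3]
```

The early-stopping check is what distinguishes this from mechanically enumerating subsets: the search terminates as soon as no candidate raises the score, which is also where the criticized dependence on the upstream ranking shows up, since a poor ranking or score can halt the search prematurely.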