2022
DOI: 10.21203/rs.3.rs-1121838/v1
Preprint
Application of Nature Inspired Soft Computing Techniques for Gene Selection: A Novel Frame Work for Classification of Cancer.

Abstract: A modified Artificial Bee Colony (ABC) metaheuristic optimization technique is applied to cancer classification; it reduces the classifier's prediction errors and allows faster convergence by selecting informative genes. The Cuckoo Search (CS) algorithm was used in the onlooker bee phase (exploitation phase) of ABC to boost performance by maintaining the balance between exploration and exploitation. The modified ABC algorithm was tuned using a Naïve Bayes (NB) classifier to further improve the accurac…
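The abstract describes a hybrid in which a Cuckoo-Search-style Lévy-flight move drives the onlooker (exploitation) phase of ABC, with a Naïve Bayes classifier scoring candidate gene subsets. A minimal sketch of that idea is given below; the dataset, parameter values, and helper names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch: ABC gene selection whose onlooker phase uses a
# Cuckoo-Search-style Levy flight, with Naive Bayes as the wrapper classifier.
# Dataset, parameters, and helpers are assumptions for demonstration only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in for a gene-expression matrix
n_genes = X.shape[1]

def fitness(mask):
    """NB cross-validated accuracy, lightly penalised for subset size."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(GaussianNB(), X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.001 * mask.sum()

def levy_step(size, beta=1.5):
    """Mantegna's algorithm for Levy-distributed step lengths (as in Cuckoo Search)."""
    from math import gamma, pi, sin
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def flip_by_probability(mask, prob):
    """Flip each gene bit with the given per-gene probability."""
    flips = rng.random(n_genes) < prob
    out = mask.copy()
    out[flips] ^= 1
    return out

n_food, limit, iters = 10, 5, 20
food = rng.integers(0, 2, (n_food, n_genes))
fit = np.array([fitness(m) for m in food])
trials = np.zeros(n_food, dtype=int)

for _ in range(iters):
    # Employed bee phase: explore around a randomly chosen partner solution.
    for i in range(n_food):
        partner = rng.integers(n_food)
        cand = food[i].copy()
        diff = food[i] != food[partner]
        cand[diff] = rng.integers(0, 2, diff.sum())
        f = fitness(cand)
        if f > fit[i]:
            food[i], fit[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1

    # Onlooker (exploitation) phase: Cuckoo-Search Levy flights around sources
    # chosen in proportion to their fitness.
    probs = fit / fit.sum()
    for _ in range(n_food):
        i = rng.choice(n_food, p=probs)
        step = np.clip(np.abs(levy_step(n_genes)) * 0.1, 0, 1)
        cand = flip_by_probability(food[i], step)
        f = fitness(cand)
        if f > fit[i]:
            food[i], fit[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1

    # Scout phase: abandon exhausted food sources.
    for i in np.where(trials > limit)[0]:
        food[i] = rng.integers(0, 2, n_genes)
        fit[i], trials[i] = fitness(food[i]), 0

best = np.argmax(fit)
print(f"best fitness {fit[best]:.3f} with {food[best].sum()} genes selected")
```

The Lévy-flight flip probabilities are what distinguish the onlooker move from the plain ABC neighbourhood search; the penalty term in the fitness keeps the selected gene subset small.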


Cited by 8 publications (4 citation statements)
References 45 publications (13 reference statements)
“…The proposed work was benchmarked on four benchmark datasets where it showed significant improvement in fitness value when compared with the swarm intelligence technique and various bi-clustering algorithms [39].…”
Section: Literature Review
Mentioning confidence: 99%
“…The study contributes to high accuracy performance compared to the previously published feature selection techniques. The classification of cancer based on gene expression using a novel framework was proposed [29]. The ABC-based modified metaheuristics optimization technique was applied for the classification task.…”
Section: Related Work
Mentioning confidence: 99%
“…In addition, the models received 94.10 percent for weighted recall, accuracy, and F1-score. Table 7 shows a comparison of the model with and without feature selection [37][38][39][40][41][42][43][44][45][46][47].…”
Section: Description of K-fold Cross-validation
Mentioning confidence: 99%
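The last statement reports weighted recall, accuracy, and F1-score with and without feature selection under k-fold cross-validation. A minimal, self-contained sketch of that evaluation protocol follows; the dataset, classifier, and feature selector are illustrative assumptions, not the cited study's setup.

```python
# Illustrative k-fold evaluation of weighted recall / accuracy / F1 with and
# without feature selection; dataset and models are assumptions only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, f1_score, recall_score
from sklearn.model_selection import StratifiedKFold
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

X, y = load_breast_cancer(return_X_y=True)
models = {
    "without selection": GaussianNB(),
    "with selection": make_pipeline(SelectKBest(f_classif, k=10), GaussianNB()),
}

for name, model in models.items():
    accs, recs, f1s = [], [], []
    folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for train, test in folds.split(X, y):
        model.fit(X[train], y[train])
        pred = model.predict(X[test])
        accs.append(accuracy_score(y[test], pred))
        recs.append(recall_score(y[test], pred, average="weighted"))
        f1s.append(f1_score(y[test], pred, average="weighted"))
    print(f"{name}: accuracy={np.mean(accs):.3f} "
          f"weighted recall={np.mean(recs):.3f} weighted F1={np.mean(f1s):.3f}")
```

Averaging the per-fold weighted metrics is one common way to produce the single summary figures quoted above.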