2020
DOI: 10.1088/1742-6596/1442/1/012027
Comparison of Support Vector Machine Recursive Feature Elimination and Kernel Function as feature selection using Support Vector Machine for lung cancer classification

Abstract: Cancer is the uncontrolled growth of abnormal cells that requires proper treatment. Cancer was the second leading cause of death according to the World Health Organization in 2018. There are more than 120 types of cancer; one of them is lung cancer. Cancer classification can improve the diagnosis, treatment, and management of cancer. Many studies have examined the classification of cancer using microarray data. Microarray data consist of thousands of features (genes) but only dozens or hundreds of samp…
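The abstract's setting (thousands of genes, few samples) is where SVM Recursive Feature Elimination is typically applied. The paper's own pipeline and data are not shown here, so the following is only a minimal sketch of SVM-RFE using scikit-learn on synthetic "microarray-like" data; the sample counts, feature counts, and `step` value are illustrative assumptions, not values from the paper.

```python
# Sketch of SVM-RFE: rank features by the weights of a linear SVM and
# recursively drop the weakest ones. All dataset parameters are assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Synthetic stand-in for microarray data: few samples, many features.
X, y = make_classification(n_samples=100, n_features=500,
                           n_informative=10, random_state=0)

# RFE needs a linear kernel so per-feature weights (coef_) exist to rank by.
selector = RFE(SVC(kernel="linear"), n_features_to_select=10, step=0.1)
selector.fit(X, y)

selected = int(selector.support_.sum())
print(selected)  # number of features retained
```

With a non-linear kernel (e.g. RBF) the SVM exposes no per-feature weights, which is why RFE is paired with the linear kernel while other kernels are compared as plain classifiers.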

Cited by 19 publications (9 citation statements) | References 11 publications
“…SVM is one of the basic approaches for supervised learning. Additionally, it is widely used in classification and regression applications, and also frequently in clustering [43, 44], feature selection [45–47], feature extraction [48, 49], etc. SVM, based on statistical learning theory [50, 51], is a distribution-independent learning algorithm, since it does not require joint distribution function information.…”
Section: Methods
confidence: 99%
“…Here we take advantage of SVM as a powerful supervised machine learning technique suited to our experimental work, as it is based on statistical learning theory and can be effectively trained to classify face biometric data by determining an optimal set of support vectors, which are members of the labeled face training samples and are selected to form a discriminative SVM classifier. The main objective of an SVM is to find optimal hyperplanes that separate the data with a maximum margin [34]. SVM can separate linear and non-linear data by applying different kernel functions, such as the linear function, the Radial Basis Function (RBF), the polynomial function, etc.…”
Section: Training SVM Classifier
confidence: 99%
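The kernel choice mentioned in the statement above (linear vs. RBF vs. polynomial) can be illustrated with a short scikit-learn comparison. This is a generic sketch on a synthetic non-linearly-separable dataset (`make_moons`), not the face-biometric or microarray data from the cited works; dataset parameters and the 5-fold split are assumptions.

```python
# Sketch: the same SVM with different kernel functions on data that is not
# linearly separable. A non-linear kernel (RBF) should fit the curved boundary.
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

scores = {}
for kernel in ("linear", "rbf", "poly"):
    scores[kernel] = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
    print(kernel, round(scores[kernel], 3))
```

On such interleaved half-moon classes the RBF kernel typically outperforms the linear one, which is the practical reason kernel choice is treated as a hyperparameter rather than fixed in advance.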
“…To investigate the validity and potency of the dynamic fusion framework for improving recognition performance, we train and use an SVM-based classifier to experiment with the proposed dynamic fusion approaches. The idea of SVM is based on structural risk minimization, which tries to find an optimal hyperplane that maximizes the margin between classes [59, 60]. The separation can be tuned by the C value (known as the regularization parameter or penalty factor), which controls the softness of the margin.…”
Section: Classifier Training
confidence: 99%
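The role of the C parameter described above can be sketched in a few lines: a small C tolerates margin violations (a soft margin, so more training points become support vectors), while a large C penalizes them heavily. This is a generic scikit-learn illustration on synthetic overlapping classes, with all dataset parameters assumed rather than taken from the cited paper.

```python
# Sketch: effect of the regularization parameter C on a linear SVM.
# Smaller C -> softer margin -> usually more support vectors.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=2, n_redundant=0,
                           class_sep=0.8, random_state=0)

sv_counts = {}
for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    sv_counts[C] = int(clf.n_support_.sum())
    print(C, sv_counts[C])
```

In practice C is chosen by cross-validation, since too small a value underfits and too large a value overfits the margin to noisy points.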