2018
DOI: 10.21873/cgp.20063

Applications of Support Vector Machine (SVM) Learning in Cancer Genomics

Abstract: Machine learning with maximization (support) of separating margin (vector), called support vector machine (SVM) learning, is a powerful classification tool that has… Machine learning (ML) "learns" a model from past data in order to predict future data (1). The key process is learning, which is a branch of artificial intelligence. Many different statistical, probabilistic, and optimization techniques can be implemented as learning methods, such as logistic regression and artificial neural networks…
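As a concrete illustration of the abstract's description, the following minimal sketch (not taken from the paper; it assumes scikit-learn and a synthetic dataset standing in for a genomic feature matrix) shows an SVM "learning" a margin-maximizing model from past data and predicting labels for held-out future data:

```python
# Minimal sketch (illustrative only): fit a margin-maximizing SVM classifier
# on synthetic "past" data and predict labels for held-out "future" data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class data standing in for a genomic feature matrix.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="linear", C=1.0)   # linear kernel, soft-margin penalty C
clf.fit(X_train, y_train)           # "learn" the model from past data
print("held-out accuracy:", clf.score(X_test, y_test))
```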

Citations: cited by 728 publications (445 citation statements)
References: 61 publications
“…Although the SVM classifier has several drawbacks, including the apparent complexity increase and large time consumption for large database [39,40], its merits are also very apparent. Specifically, as for the small samples like the circumstance in this study, SVM can usually get favorable results using the limited datasets in the training set [39][40][41]. Besides, the generalizability of the SVM classifier is also remarkable in terms of the small and limited datasets [39,40].…”
Section: Discussion
Citation type: mentioning
confidence: 99%
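To illustrate the small-sample point quoted above (an illustrative sketch only; scikit-learn, a synthetic 60-sample dataset, and an RBF kernel are assumptions, not details from the cited studies), cross-validation is a common way to check how well an SVM generalizes when data are limited:

```python
# Sketch: with few samples, k-fold cross-validation gives a less optimistic
# estimate of how well the SVM generalizes than a single train/test split.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=30, n_informative=5,
                           random_state=1)           # small-sample setting
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)           # 5-fold cross-validation
print("mean CV accuracy: %.2f (+/- %.2f)" % (scores.mean(), scores.std()))
```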
“…The hyperplane is orientated that it is as far as possible from the nearest data points from each class. These nearest points are called the support vectors (27).…”
Section: Classification Analysis Using SVM
Citation type: mentioning
confidence: 99%
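The quoted definition can be made concrete with a short sketch (assumed setup using scikit-learn; not code from the cited work): after fitting a linear SVM, the training points nearest the separating hyperplane are exposed as the model's support vectors, and only these points determine the hyperplane:

```python
# Sketch: inspect the support vectors of a fitted linear SVM.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X, y)

print("support vectors per class:", clf.n_support_)
print("support vectors (training points closest to the hyperplane):")
print(clf.support_vectors_)
# For a linear kernel the hyperplane is w.x + b = 0,
# with w = clf.coef_[0] and b = clf.intercept_[0].
```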
“…This decision boundary, also referred to as "hyperplane", is orientated in such a way that it is as far as possible from the closest data points from each of the classes. These closest points are called support vectors [23]. Unlike other algorithms based on nonlinear optimization, the danger of getting trapped in local minima is low and the solution is unique and globally optimal [24,25].…”
Section: Imaging Parameters
Citation type: mentioning
confidence: 99%
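The uniqueness and global optimality noted in the quote follow from the fact that soft-margin SVM training is a convex quadratic program; a standard textbook statement of the primal problem (not taken from the cited works) is

\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
\quad \text{subject to} \quad
y_{i}\bigl(w^{\top}x_{i} + b\bigr) \ge 1 - \xi_{i},\qquad \xi_{i} \ge 0 .
\]

Because the objective is convex and the constraints are affine, any local minimum is also the global minimum, which is the contrast with other nonlinear-optimization-based learners drawn in the quotation.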