2022
DOI: 10.20944/preprints202201.0232.v1
Preprint

Tutorial on Support Vector Machines

Abstract: The aim of this tutorial is to help students grasp the theory and applicability of support vector machines (SVMs). The contribution is an intuitive tutorial that helps students gain insight into SVMs from a unique perspective. An internet search reveals many videos and articles on SVMs, but many give simplified explanations that leave gaps in the derivations that beginning students cannot fill, and most free tutorials lack guidance on practical applications and considerations. The software wrappe…

Cited by 7 publications (9 citation statements)
References 5 publications
“…According to [24], the support vector machine (SVM) uses the maximum-margin classifier (hyperplane) concept to overcome the prediction-classification problems that occur in other linear classifiers. The mathematical calculations used to classify correctly with the SVM method are as follows.…”
Section: E. Support Vector Machine (mentioning)
confidence: 99%
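The maximum-margin idea quoted above can be sketched in a few lines of plain Python. This is an illustrative toy, not the cited paper's implementation; the weight vector `w` and bias `b` are assumed values chosen for the example, and the classic result that the margin width is 2/||w|| is used.

```python
import math

def decision(w, b, x):
    """Sign of the linear decision function w·x + b."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

def margin_width(w):
    """Geometric margin width 2/||w|| of a maximum-margin hyperplane."""
    return 2.0 / math.sqrt(sum(wi * wi for wi in w))

# Toy hyperplane x1 + x2 - 3 = 0 separating two 2-D classes
w, b = [1.0, 1.0], -3.0
print(decision(w, b, [2.5, 2.5]))  # point above the hyperplane -> 1
print(decision(w, b, [0.0, 0.0]))  # point below the hyperplane -> -1
print(margin_width(w))             # 2/sqrt(2) ~ 1.414
```

Maximizing `margin_width(w)` subject to all training points being classified correctly is exactly the optimization the quoted passage alludes to.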
“…The concept is to separate data across a decision boundary (hyperplane) determined by a small subset of the data (feature vectors). The data points that support the decision boundary are called support vectors (38). The SVM assumes that the multi-feature data are linearly separable in the input space.…”
Section: Model Development (mentioning)
confidence: 99%
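For a hard-margin SVM, the support vectors mentioned in the quote are exactly the training points lying on the margin boundaries |w·x + b| = 1. A minimal sketch, assuming a known hyperplane `(w, b)` and using a hypothetical helper name `support_vector_indices`:

```python
def support_vector_indices(w, b, X, tol=1e-6):
    """Indices of training points on a margin boundary |w·x + b| = 1
    (the support vectors of a hard-margin SVM with known w, b)."""
    idx = []
    for i, x in enumerate(X):
        s = sum(wi * xi for wi, xi in zip(w, x)) + b
        if abs(abs(s) - 1.0) < tol:
            idx.append(i)
    return idx

# Toy data on the x-axis with hyperplane x1 = 0 (w = [1, 0], b = 0):
X = [[1.0, 0.0], [-1.0, 0.0], [3.0, 0.0]]
print(support_vector_indices([1.0, 0.0], 0.0, X))  # -> [0, 1]
```

Only the first two points touch the margin, so only they "support" the boundary; moving the third point (as long as it stays outside the margin) would not change the hyperplane.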
“…This analysis uses the radial kernel. Bridgelall (37), Mountrakis et al. (38), and Yoonsuh and Hu (40) explain the use of the kernel trick in more detail.…”
Section: Model Development (mentioning)
confidence: 99%
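The radial (RBF) kernel named in this excerpt measures similarity between two points as a Gaussian of their squared distance, K(x, z) = exp(-γ‖x − z‖²). A minimal sketch, with `gamma` as an assumed hyperparameter:

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """Radial basis function kernel K(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([1.0, 2.0], [1.0, 2.0]))  # identical points -> 1.0
print(rbf_kernel([0.0, 0.0], [3.0, 0.0]))  # distant points -> near 0
```

The "kernel trick" is that the SVM only ever needs such inner-product-like values, so it can operate in the kernel's implicit (here infinite-dimensional) feature space without ever computing coordinates there.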
“…The SVM then uses that hyperplane to predict the class of a new data object once presented with its feature vector. See Figure 4 [26].…”
Section: Model Building (mentioning)
confidence: 99%
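Prediction for a new feature vector, as described in this last excerpt, uses only the stored support vectors via the kernel expansion f(x) = sign(Σᵢ αᵢ yᵢ K(xᵢ, x) + b). A sketch under assumed toy values for the support vectors, dual coefficients `alphas`, labels, and bias (a real model would learn these during training):

```python
import math

def rbf(x, z, gamma=0.5):
    """RBF kernel used inside the decision function."""
    return math.exp(-gamma * sum((xi - zi) ** 2 for xi, zi in zip(x, z)))

def svm_predict(support_vectors, alphas, labels, b, x, kernel=rbf):
    """Predict the class of x from the kernel expansion over support vectors."""
    s = sum(a * y * kernel(sv, x)
            for sv, a, y in zip(support_vectors, alphas, labels)) + b
    return 1 if s >= 0 else -1

# Assumed (not trained) model: one support vector per class
svs = [[0.0, 0.0], [2.0, 2.0]]
print(svm_predict(svs, [1.0, 1.0], [-1, 1], 0.0, [2.0, 2.0]))  # -> 1
print(svm_predict(svs, [1.0, 1.0], [-1, 1], 0.0, [0.0, 0.0]))  # -> -1
```

A new point is pulled toward the class of the support vectors it is most similar to under the kernel, which is the behavior Figure 4 of the citing paper illustrates geometrically.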