2016
DOI: 10.3233/ida-150348

EuDiC SVM: A novel support vector machine classification algorithm


Cited by 11 publications (19 citation statements)
References 21 publications
“…IBk and K* are simple MLTs that work well on basic recognition problems. One of the shortcomings of these two algorithms is that they are lazy learners; that is, they do not learn anything from the training data but simply use the training data itself for prediction, which is why they are not robust for predicting noisy data. For this reason, SVM and RF performed better than these MLTs in our study.…”
Section: Discussion (mentioning)
confidence: 83%
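For readers unfamiliar with lazy learners, the sketch below illustrates the behaviour the excerpt criticizes: an instance-based classifier in the spirit of IBk (k-nearest neighbours) whose "training" step only memorizes the data, so every noisy training point participates directly in prediction. This is a generic illustration, not code from the cited study.

```python
# Minimal sketch of a lazy (instance-based) learner in the spirit of IBk / k-NN.
# "Training" only stores the data; all work happens at prediction time, which is
# why noisy training points directly distort predictions.
import numpy as np

class LazyKNN:
    def fit(self, X, y, k=3):
        # No model is built: the training data is simply memorized.
        self.X, self.y, self.k = np.asarray(X), np.asarray(y), k
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X):
            d = np.linalg.norm(self.X - x, axis=1)        # distance to every stored point
            nearest = self.y[np.argsort(d)[: self.k]]     # labels of the k closest points
            preds.append(np.bincount(nearest).argmax())   # majority vote
        return np.array(preds)

clf = LazyKNN().fit([[0, 0], [0, 1], [5, 5], [6, 5]], [0, 0, 1, 1], k=3)
print(clf.predict([[0.2, 0.3], [5.5, 5.0]]))              # -> [0 1]
```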
“…The Support Vector Machine (SVM) represents a supervised machine learning paradigm that endeavors to identify a hyperplane effectively demarcating two classes [33]. Given that there exists an infinite number of hyperplanes capable of precisely segregating the two classes, SVM aims to select the optimal hyperplane by identifying the one with the maximum margin.…”
Section: Support Vector Machine (SVM) (mentioning)
confidence: 99%
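The excerpt above summarizes the standard maximum-margin idea. The following minimal sketch (assuming scikit-learn, and not taken from the EuDiC SVM paper) fits a linear SVM on a toy two-class problem and reads off the separating hyperplane and its margin width.

```python
# Maximum-margin linear SVM on a toy problem (generic illustration only).
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes in 2-D.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class 0
              [6.0, 5.0], [7.0, 8.0], [8.0, 8.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A large C with a linear kernel approximates the hard-margin formulation:
# the solver selects the hyperplane w.x + b = 0 with the widest margin.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
margin = 2.0 / np.linalg.norm(w)          # geometric margin width
print("support vectors:", clf.support_vectors_)
print("w =", w, "b =", b, "margin =", margin)
print("prediction for [4, 4]:", clf.predict([[4.0, 4.0]]))
```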
“…SVCs are used for separating a dataset into classes, producing a discrete output (class label). SVCs are widely used for both linear and non-linear data and are considered easily applicable to unseen datasets [105]. Although SVMs have some limitations, such as the inability to identify more than two classes at a time, or the difficulty of applying them to large datasets [106,107], they are still some of the most widely used classification algorithms in a few fields such as image classification and object detection in the field of remote sensing [107,108].…”
Section: Support Vector Machines (mentioning)
confidence: 99%
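The binary-only limitation mentioned in the excerpt is commonly worked around by decomposing a multi-class task into several binary SVMs. A minimal one-vs-rest sketch, again assuming scikit-learn and unrelated to the cited works:

```python
# One-vs-rest decomposition: one binary SVM is trained per class internally.
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)                        # 3 classes
ovr = OneVsRestClassifier(SVC(kernel="rbf", C=1.0)).fit(X, y)
print(ovr.predict(X[:5]))                                # multi-class predictions
```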
“…The support vectors thus define the distance (also called a "margin") between the two additional hyperplanes. Then, the SVM algorithm selects the classification hyperplane with an orientation that maximizes the margin, ensuring a more accurate classification [102,105,106,110].…”
Section: Support Vector Machines (mentioning)
confidence: 99%
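The margin maximization described above corresponds to the textbook hard-margin formulation (not specific to the EuDiC variant): the two additional hyperplanes are $\mathbf{w}^{\top}\mathbf{x} + b = \pm 1$, the distance between them is $2/\lVert \mathbf{w} \rVert$, and the widest-margin classifier solves

```latex
% Hard-margin SVM (standard formulation)
\min_{\mathbf{w},\, b}\ \tfrac{1}{2}\lVert \mathbf{w} \rVert^{2}
\quad \text{subject to} \quad
y_i\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1,
\qquad i = 1, \dots, n,
```

where the support vectors are exactly the training points for which the constraint holds with equality.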