2016
DOI: 10.1016/j.neucom.2016.04.059
Computational performance optimization of support vector machine based on support vectors

Cited by 18 publications (6 citation statements)
References 26 publications
“…In addition to accuracy, the area under the ROC curve (AUC) was also used to measure the performance of the custom shallow CNN with SoftMax and the custom VGG16 with SoftMax. AUC was chosen because it considers the entire range of threshold values between 0 and 1 and is unaffected by class distribution and misclassification cost 11,48,49. The AUC can be treated as a measure of separability: the class whose ROC curve reaches closest to the top-left corner is the most separable.…”
Section: Methods
confidence: 99%
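The AUC described above can be computed without any plotting, as the probability that a randomly chosen positive sample is ranked above a randomly chosen negative one. A minimal pure-Python sketch, using hypothetical scores and labels (not data from the cited study):

```python
def auc(labels, scores):
    """Rank-based AUC: fraction of positive/negative pairs where the
    positive sample's score outranks the negative's (ties count 0.5)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# A perfect ranking (every positive scored above every negative) gives 1.0
print(auc([1, 1, 0, 0], [0.9, 0.4, 0.35, 0.2]))  # -> 1.0
```

This pairwise formulation also makes the class-distribution claim concrete: duplicating negative samples rescales both numerator and denominator, leaving the AUC unchanged.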
“…SVM is a supervised learning algorithm developed in 1992 by Boser, Guyon, and Vapnik based on statistical learning theory. 23 The SVM algorithm performs strongly on classification and regression problems with small sample sizes, nonlinearity, and high dimensionality. The 2D and 3D scatterplots show that the feature subset has some linear separability in its spatial distribution, so we applied the SVM algorithm to the breast cancer classification and diagnosis task.…”
Section: Methods
confidence: 99%
“…Due to limited sample information, we sought the best compromise between model complexity and learning ability in order to obtain the best generalization ability. By determining a hyperplane in the input space, data samples are divided into multiple classes so that the separation between classes is maximized (Wang et al, 2016). When SVM is applied to regression fitting analysis, the optimal surface is identified as the one that minimizes the deviation of all training samples from it.…”
Section: SVM Analysis
confidence: 99%
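The regression use sketched in this excerpt corresponds to the standard ε-insensitive support vector regression formulation. As a hedged restatement of that textbook objective (not reproduced from the cited paper), with slack variables $\xi_i, \xi_i^*$ measuring deviations beyond the tolerance $\varepsilon$:

$$\min_{w,\,b,\,\xi,\,\xi^*} \;\; \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\left(\xi_i + \xi_i^*\right)$$

subject to, for each training sample $(x_i, y_i)$:

$$y_i - \left(w \cdot x_i + b\right) \le \varepsilon + \xi_i, \qquad \left(w \cdot x_i + b\right) - y_i \le \varepsilon + \xi_i^*, \qquad \xi_i, \xi_i^* \ge 0.$$

The $\frac{1}{2}\|w\|^2$ term plays the same role as margin maximization in classification (controlling model complexity), while $C$ trades it off against the fitting errors, matching the complexity-versus-learning-ability compromise the excerpt describes.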