2021
DOI: 10.31284/j.jasmet.2021.v2i1.1467

Support Vector Machine optimization with fractional gradient descent for data classification

Abstract: Data classification faces several problems, one of which is that a large amount of data increases computing time. SVM is a reliable classifier for linear or non-linear data, but for large-scale data it runs into computational time constraints. Fractional gradient descent is an unconstrained optimization algorithm for training support vector machine classifiers, whose underlying training problem is convex. Compared to the classic integer-order model, a model built with fractional calculus has a significant advantage …
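As a rough illustration of the approach the abstract describes (not the authors' actual implementation), the Python sketch below trains a linear SVM with hinge loss, replacing the ordinary gradient step with a Caputo-style fractional-order update. The function names, hyperparameters, and the specific fractional step formula are assumptions drawn from common variants in the fractional gradient descent literature; alpha = 1 recovers plain gradient descent.

```python
import numpy as np
from math import gamma

def hinge_loss_grad(w, b, X, y, C=1.0):
    """Subgradient of 0.5*||w||^2 + C * sum(max(0, 1 - y*(X@w + b)))."""
    margins = y * (X @ w + b)
    active = margins < 1                      # margin-violating samples
    grad_w = w - C * (y[active, None] * X[active]).sum(axis=0)
    grad_b = -C * y[active].sum()
    return grad_w, grad_b

def fractional_svm_fit(X, y, alpha=0.9, lr=1e-3, epochs=500, C=1.0):
    """Hypothetical trainer: linear SVM via a fractional gradient step.

    The ordinary update  w <- w - lr * g  is replaced by
        w <- w - lr * g * |w - w_prev|**(1 - alpha) / Gamma(2 - alpha),
    one common discretisation in the fractional gradient descent
    literature; alpha = 1 recovers standard gradient descent.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    w_prev = w + 1e-3            # offset so |w - w_prev| starts nonzero
    scale = 1.0 / gamma(2.0 - alpha)
    for _ in range(epochs):
        g_w, g_b = hinge_loss_grad(w, b, X, y, C)
        step = scale * np.abs(w - w_prev) ** (1.0 - alpha)
        w_prev = w.copy()
        w -= lr * g_w * step
        b -= lr * g_b * scale    # bias handling varies between papers
    return w, b
```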

Cited by 3 publications (5 citation statements)
References 9 publications
“…Three supervised learning models were employed in the present study: K-nearest neighbor (KNN), support vector machine (SVM) and logistic regression (LR). KNN is a basic classification and regression method ( 24 ). SVM is a generalized linear classifier for binary classification of data according to supervised learning, with decision boundary being the maximum margin hyperplane for learning samples ( 25 ).…”
Section: Methods (mentioning)
confidence: 99%
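
The three models named in this citing study map directly onto scikit-learn estimators. The sketch below only outlines such a comparison; the dataset and hyperparameters are placeholders, not those of the cited work.

```python
from sklearn.datasets import load_breast_cancer      # placeholder binary dataset
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),  # distance-based baseline
    "SVM": SVC(kernel="linear"),                 # maximum-margin hyperplane
    "LR": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)  # scale features, then fit
    clf.fit(X_tr, y_tr)
    print(name, clf.score(X_te, y_te))
```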
“…Two more applications of fractional gradient descent are found in [64,65]. In these publications, the researchers combined fractional gradient descent with support vector machines for two classic datasets, i.e., the Iris and the Rainfall datasets.…”
Section: Fractional Gradient-based Optimization (mentioning)
confidence: 99%
“…Here, we want to point out two exemplary applications by Hapsari et al [64,65] on well-known test-datasets, i.e., the Iris and the rainfall dataset, for the gradient-based optimization of machine learning algorithms.…”
Section: Optimization (mentioning)
confidence: 99%
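
Since these citing papers single out the Iris dataset, here is a hypothetical usage example for the fractional-gradient SVM sketch given after the abstract, restricted to two Iris classes so the binary formulation applies (labels mapped to ±1):

```python
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
mask = y < 2                                   # keep two classes: binary SVM
X2 = X[mask]
y2 = np.where(y[mask] == 0, -1.0, 1.0)         # map labels to {-1, +1}

w, b = fractional_svm_fit(X2, y2, alpha=0.9, lr=1e-3, epochs=500)
acc = np.mean(np.sign(X2 @ w + b) == y2)
print(f"training accuracy: {acc:.3f}")
```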
“…Two more applications of fractional gradient descent are found in [63] and [64]. In these publications, the researchers combine fractional gradient descent with support vector machines for two classic data sets, i.e., the Iris and the Rainfall data set.…”
Section: Fractional Gradient Based Optimization (mentioning)
confidence: 99%
“…Here we want to point out two exemplary applications by Hapsari et al [63,64] on well-known test-data sets, i.e., the Iris and the rainfall data set, for the gradient-based optimization of machine learning algorithms.…”
Section: Optimization (mentioning)
confidence: 99%