2014 27th SIBGRAPI Conference on Graphics, Patterns and Images
DOI: 10.1109/sibgrapi.2014.36
Learning Kernels for Support Vector Machines with Polynomial Powers of Sigmoid

Cited by 5 publications (4 citation statements)
References 9 publications
“…However, to obtain a good distribution of the complex problem, a training set with a high number of instances is necessary. SVM-based classification uses the kernel functions Linear, Radial Basis Function (RBF), Polynomial, and Sigmoid [29], [101].…”
Section: Support Vector Machines
confidence: 99%
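The excerpt above names the four kernel functions commonly used in SVM-based classification. A minimal sketch of trying all four, assuming scikit-learn and a synthetic dataset (neither is mentioned in the cited papers):

```python
# Sketch: the four kernels named in the excerpt (linear, RBF,
# polynomial, sigmoid), fitted via scikit-learn's SVC.
# scikit-learn and the toy dataset are assumptions for illustration.
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, n_features=4, random_state=0)

for kernel in ["linear", "rbf", "poly", "sigmoid"]:
    clf = SVC(kernel=kernel, C=1.0).fit(X, y)
    print(kernel, round(clf.score(X, y), 2))
```

Which kernel performs best is problem-dependent, which is why kernel choice (and kernel learning, as in the paper under discussion) matters.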
“…) comprise the pair of input and output vectors, w is the weight factor, b is the threshold value, C is the penalty parameter, the kernel function Φ is used to map input samples to a higher-dimensional space, ε is the epsilon parameter, and the upper and lower training errors are ξ_i and ξ_i^*, respectively. There are three typical kernel functions: RBF, polynomial, and linear [22][23][24]. These kernel functions are used to map nonlinearly between the input and feature spaces.…”
Section: Support Vector Regression
confidence: 99%
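The ε-SVR formulation in the excerpt (penalty C, ε-tube, slack variables ξ_i and ξ_i^*) can be sketched with scikit-learn's SVR; the library, data, and parameter values are assumptions, not taken from the cited work:

```python
# Sketch: epsilon-SVR with the parameters named in the excerpt.
# C is the penalty parameter, epsilon the tube width; the RBF kernel
# plays the role of the implicit mapping Phi. Values are illustrative.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, 100)

model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
# Samples outside the epsilon tube incur slack (xi_i or xi_i^*)
# and become support vectors.
print(len(model.support_), "support vectors")
```

Shrinking ε widens the tube's tolerance band in reverse: a smaller ε forces more points outside the tube, so more samples become support vectors.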
“…The sigmoid kernel is best known from neural networks, but it is still a useful kernel that can be used with SVMs in many applications [14,15]. The kernel function can be represented as follows:…”
Section: Sigmoid Kernel
confidence: 99%
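The excerpt's formula is truncated in this extract. The standard sigmoid (hyperbolic tangent) kernel is K(x, y) = tanh(γ⟨x, y⟩ + r); a minimal sketch, with γ and r as illustrative values:

```python
# Sketch: the sigmoid kernel, K(x, y) = tanh(gamma * <x, y> + r).
# gamma and r are hypothetical hyperparameter values for illustration.
import numpy as np

def sigmoid_kernel(x, y, gamma=0.5, r=0.0):
    """Sigmoid (tanh) kernel between two vectors."""
    return np.tanh(gamma * np.dot(x, y) + r)

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
print(sigmoid_kernel(x, y))  # a value in (-1, 1)
```

Note that, unlike the RBF kernel, the sigmoid kernel is not positive semi-definite for all choices of γ and r, which is part of why kernel design for it is a research topic.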
“…Multi-scale spectral, size, shape, and texture information are used for classification. There is a low number of training samples for each class (14–30) and a high number of classification features (148). The source of the data is the Institute for Global Environmental Strategies; 2108-11 Kamiyamaguchi, Hayama, Kanagawa, 240-0115 Japan [24].…”
Section: Data Sets
confidence: 99%