The support vector machine (SVM) is a kernel-based pattern classification method that is significant in areas such as data mining and machine learning. A unique strength is its use of a kernel function to map the data into a higher-dimensional feature space. In SVM training, the kernel and its parameters play a vital role in classification accuracy, so a suitable kernel and parameter setting must be chosen. In this paper, we examine several kernel functions for multiclass support vector machines and propose an appropriate, optimal kernel for the one-against-one (OAO) and one-against-all (OAA) multiclass SVMs. The performance of the OAO and OAA multiclass SVMs is illustrated by empirical results and evaluated in terms of the number of support vectors, support vector percentage, classification error, training error, and CPU time. The experimental results demonstrate the ability to use a more generalized kernel function and show the efficiency of the polynomial kernel in achieving high classification accuracy across several datasets.
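As a minimal sketch of the kind of comparison the abstract describes (not the authors' code), the following Python snippet trains OAO and OAA multiclass SVMs with linear, polynomial, and RBF kernels and reports support vectors, support vector percentage, training and test error, and CPU time. The use of scikit-learn, the Iris dataset, and the specific hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch: compare one-against-one (OAO) and one-against-all (OAA)
# multiclass SVMs with different kernels. Dataset and hyperparameters are
# placeholders, not the paper's experimental setup.
import time

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

kernels = {
    "linear": dict(kernel="linear"),
    "polynomial": dict(kernel="poly", degree=3, gamma="scale"),
    "rbf": dict(kernel="rbf", gamma="scale"),
}

for name, params in kernels.items():
    for scheme in ("OAO", "OAA"):
        base = SVC(C=1.0, **params)
        # SVC is inherently one-against-one; wrap it to get one-against-all.
        clf = base if scheme == "OAO" else OneVsRestClassifier(base)

        start = time.process_time()
        clf.fit(X_train, y_train)
        cpu = time.process_time() - start

        # Count support vectors (summed over the binary sub-problems for OAA,
        # so a training point may be counted more than once there).
        if scheme == "OAO":
            n_sv = clf.support_vectors_.shape[0]
        else:
            n_sv = sum(est.support_vectors_.shape[0] for est in clf.estimators_)
        sv_pct = 100.0 * n_sv / X_train.shape[0]

        train_err = 1.0 - clf.score(X_train, y_train)
        test_err = 1.0 - clf.score(X_test, y_test)
        print(f"{name:10s} {scheme}: SVs={n_sv:3d} ({sv_pct:5.1f}%) "
              f"train err={train_err:.3f} test err={test_err:.3f} cpu={cpu:.3f}s")
```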