New Trends in Applied Artificial Intelligence
DOI: 10.1007/978-3-540-73325-6_75
A New Multi-class Support Vector Machine with Multi-sphere in the Feature Space

Cited by 25 publications
(25 citation statements)
References 6 publications
“…In order to apply one-class SVM to a multiclass problem, multiple one-class SVM models are trained together. Hao and Lin [11] formulated multiple one-class SVM models as one large optimization problem to minimize the total error, while Refs. [12,13] used separate one-class SVM models and iteratively adjusted the threshold of each model to maximize the total accuracy.…”
Section: One-class SVM
confidence: 99%
“…SVDD has parameters that control the margin error. Two approaches based on SVDD were proposed in [8,9]. The idea behind these approaches is to determine one SVDD for each class of the training set and tune its parameters to improve the accuracy of each classifier.…”
Section: Introduction
confidence: 99%
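The per-classifier tuning just described can be sketched as a small per-class grid search, again with `OneClassSVM` standing in for SVDD (the two are equivalent for the RBF kernel): each class's model gets its own `nu`, chosen to best separate that class from the rest of the training data. The grid values and the scoring rule here are illustrative assumptions, not the procedure from [8,9].

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.svm import OneClassSVM

X, y = load_wine(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize features

best = {}
for label in np.unique(y):
    own, rest = X[y == label], X[y != label]
    best_score, best_model = -np.inf, None
    for nu in (0.05, 0.1, 0.2, 0.4):
        m = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(own)
        # Reward accepting the model's own class, penalize accepting
        # examples from the other classes.
        score = (m.predict(own) == 1).mean() - (m.predict(rest) == 1).mean()
        if score > best_score:
            best_score, best_model = score, m
    best[label] = (best_model, best_score)
```

Tuning each sphere independently like this mirrors the separately-trained variant; the alternative discussed above is to fold all the per-class models into one joint optimization problem.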
“…Therefore, these approaches more closely resemble the optimal Bayesian classifier, where the optimal decision is slightly deviated toward the class with the smaller variance. The method proposed in [8] trains each SVDD separately by solving an optimization problem, while the one proposed in [9] trains all the SVDDs by formulating a single optimization problem.…”
Section: Introduction
confidence: 99%
“…Therefore, one should infer a classifier from a more or less limited set of training examples. In the classical decision framework, many historical strands of research can be identified: statistical methods, Support Vector Machines, and Neural Networks — Bishop (2006); Guobin & Lu (2007); Hao & Lin (2007); Husband & Lin (2002); Vapnik (1998); Yang et al. (2007)... In the class-selective rejection scheme, less work has been done: Ha (1997); Horiuchi (1998).…”
Section: Introduction
confidence: 99%
“…Taking advantage of the regularization path method, the entire parameter search space is considered. Compared to similar approaches — Bottou et al. (1994); Hao & Lin (2007); Yang et al. (2007) — the search space is widely extended, so the selected decision rule is more likely to be the optimal one. Note that the standard multiclass learning strategy is a particular case of the proposed approaches, where the different decision options are given by the pre-defined classes, the loss function is given by the error rate, and no constraint is considered.…”
Section: Introduction
confidence: 99%