Proceedings of the Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2002
DOI: 10.1145/775047.775122

Scaling multi-class support vector machines using inter-class confusion

Abstract: Support vector machines (SVMs) excel at two-class discriminative learning problems. They often outperform generative classifiers, especially those that use inaccurate generative models, such as the naive Bayes (NB) classifier. On the other hand, generative classifiers have no trouble in handling an arbitrary number of classes efficiently, and NB classifiers train much faster than SVMs owing to their extreme simplicity. In contrast, SVMs handle multi-class problems by learning redundant yes/no (one-vs-others) c…
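The one-vs-others decomposition the abstract refers to trains one binary SVM per class, so k classes require k expensive binary learners, while NB covers all k classes in a single cheap pass. A minimal sketch of that contrast, assuming scikit-learn and the 20 Newsgroups corpus (illustrative tooling choices, not the paper's own implementation):

# One-vs-others (one-vs-rest) multi-class SVM versus a single Naive Bayes
# pass; sketch only, using scikit-learn rather than the paper's code.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")
vec = TfidfVectorizer()
Xtr, Xte = vec.fit_transform(train.data), vec.transform(test.data)

# One binary SVM per class: k classifiers for k classes.
ovr_svm = OneVsRestClassifier(LinearSVC()).fit(Xtr, train.target)
print("one-vs-others SVM accuracy:", ovr_svm.score(Xte, test.target))

# Naive Bayes covers all k classes with one fast generative model.
nb = MultinomialNB().fit(Xtr, train.target)
print("naive Bayes accuracy:", nb.score(Xte, test.target))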

Cited by 39 publications (34 citation statements, all classified as mentioning); references 9 publications. Citing publications span 2003 to 2019.
“…The earliest machine learning method applied was naive Bayes (NB) [3,4]. Subsequently, almost all of the important machine learning algorithms have been applied to text classification, for example k-nearest neighbor (KNN), neural networks (NN), support vector machines (SVM), decision trees, kernel learning, and others [5][6][7][8][9][10]. SVM uses a shallow linear model to separate the classes.…”
Section: Text Classification (mentioning)
confidence: 99%
“…Godbole et al [14] propose a new multi-class classification technique that exploits the accuracy of SVMs and the speed of Naive Bayes. It uses a Naive Bayes classifier to compute a confusion matrix quickly.…”
Section: Related Work (mentioning)
confidence: 99%
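The cheap NB confusion-matrix step described above can be sketched as follows. This is a simplified illustration assuming scikit-learn; the threshold-based rule for flagging confusable class pairs is a hypothetical stand-in, not the paper's exact grouping procedure:

# Estimate inter-class confusion with a fast Naive Bayes pass on held-out
# data; classes that confuse each other often are candidates for a joint,
# more expensive SVM. Assumes every class appears in the validation split.
from sklearn.metrics import confusion_matrix
from sklearn.naive_bayes import MultinomialNB

def confusable_pairs(X_train, y_train, X_val, y_val, threshold=0.05):
    nb = MultinomialNB().fit(X_train, y_train)
    cm = confusion_matrix(y_val, nb.predict(X_val))
    rates = cm / cm.sum(axis=1, keepdims=True)  # row-normalized error rates
    return [(i, j) for i in range(len(cm)) for j in range(len(cm))
            if i != j and rates[i, j] >= threshold]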
“…This categorization problem is usually viewed as supervised learning, where the goal is to assign predefined category labels to unlabeled documents based on the likelihood inferred from the training set of labeled documents. Numerous approaches have been applied, including Bayesian probabilistic approaches [20,31], nearest neighbor [22,19], neural networks [33], decision trees [2], inductive rule learning [4,9], support vector machines [18,14], Maximum Entropy [26], boosting [28], and linear discriminant projection [3] (see [34] for comparative studies of text categorization methods).…”
Section: Introduction (mentioning)
confidence: 99%
“…It has been widely applied in many domains of pattern recognition, such as character recognition [10], age estimation [11], and face recognition [12]. The multistage classification methodology proposed in [13] aims at reducing mutual misclassifications among easily confused labels and consists of two stages. First, an inexpensive classifier that can discriminate among all labels coarsely classifies a test instance.…”
Section: Introduction (mentioning)
confidence: 99%
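A rough sketch of the two-stage scheme summarized above, assuming scikit-learn; the choice of coarse and fine models and the precomputed confusable groups are illustrative assumptions, not the exact method of [13]:

# Stage 1: a cheap classifier over all labels. Stage 2: re-classify only
# instances routed to a group of easily confused labels, using a stronger
# model trained on just that group's examples.
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

class TwoStageClassifier:
    def __init__(self, confusable_groups):
        self.groups = confusable_groups  # e.g. [{3, 7}, {1, 4, 9}]
        self.coarse = MultinomialNB()
        self.fine = {}                   # one SVM per confusable group

    def fit(self, X, y):
        self.coarse.fit(X, y)
        for gid, group in enumerate(self.groups):
            mask = np.isin(y, list(group))
            self.fine[gid] = LinearSVC().fit(X[mask], y[mask])
        return self

    def predict(self, X):
        y_hat = self.coarse.predict(X)
        for gid, group in enumerate(self.groups):
            mask = np.isin(y_hat, list(group))
            if mask.any():  # refine only the easily confused labels
                y_hat[mask] = self.fine[gid].predict(X[mask])
        return y_hat

For example, after confusable_pairs identifies classes 3 and 7 as mutually confused, TwoStageClassifier([{3, 7}]).fit(Xtr, ytr).predict(Xte) routes only those coarse predictions through the dedicated SVM.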