2010
DOI: 10.15837/ijccc.2010.3.2487

Improving a SVM Meta-classifier for Text Documents by using Naive Bayes

Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. In this paper, we investigate two approaches: (a) developing a classifier for text documents based on Naive Bayes theory, and (b) integrating this classifier into a meta-classifier in order to increase classification accuracy. The basic idea is to learn a meta-classifier that optimally selects the best component classifier for each data point. The experimental results show that combining classifiers can significan…
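The Naive Bayes component described in the abstract can be sketched as a multinomial Naive Bayes text classifier. This is a minimal pure-Python illustration of the general technique only; the toy corpus, labels, and tokenization below are assumptions, not the paper's dataset or implementation.

```python
# Minimal multinomial Naive Bayes text classifier in pure Python.
# Illustrative sketch only: the toy corpus and labels below are
# assumptions, not the data or code from the paper.
import math
from collections import Counter, defaultdict

def train_nb(docs, labels):
    """Collect per-class word counts, class document counts, and the vocabulary."""
    word_counts = defaultdict(Counter)   # class -> word -> count
    class_counts = Counter(labels)       # class -> number of training documents
    vocab = set()
    for text, label in zip(docs, labels):
        words = text.lower().split()
        word_counts[label].update(words)
        vocab.update(words)
    return word_counts, class_counts, vocab

def predict_nb(text, word_counts, class_counts, vocab):
    """Pick the class maximizing log P(class) + sum of log P(word | class),
    with add-one (Laplace) smoothing; out-of-vocabulary words are ignored."""
    total_docs = sum(class_counts.values())
    best_class, best_score = None, -math.inf
    for cls, n_docs in class_counts.items():
        score = math.log(n_docs / total_docs)
        total_words = sum(word_counts[cls].values())
        for word in text.lower().split():
            if word in vocab:
                score += math.log((word_counts[cls][word] + 1)
                                  / (total_words + len(vocab)))
        if score > best_score:
            best_class, best_score = cls, score
    return best_class

docs = ["the match ended in a draw", "stocks fell sharply today",
        "the striker scored twice", "markets rallied on earnings"]
labels = ["sport", "finance", "sport", "finance"]
stats = train_nb(docs, labels)
print(predict_nb("the striker scored a goal", *stats))  # prints "sport"
```

In log space the product of word likelihoods becomes a sum, which avoids numeric underflow on long documents; Laplace smoothing prevents a zero probability for words unseen in a class.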

Cited by 8 publications (8 citation statements)
References 8 publications
“…In our previous work [9] and [11], a meta-classifier is presented that combines 8 SVM classifiers and one Naive Bayes classifier to improve the classification accuracy for text documents. In those works, 3 meta-classifier models are used: majority vote, selection based on the Euclidean distance, and selection based on the cosine angle.…”
Section: Meta-classifier Models
confidence: 99%
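The three combination rules named in this citation can be sketched in plain Python. The per-classifier prototype vectors used for the two selection rules are an assumed representation for illustration; the cited works define selection in their own feature space.

```python
# Sketches of the three meta-classifier combination rules named above:
# majority vote, Euclidean-distance selection, and cosine-angle selection.
# The per-classifier prototype vectors are an illustrative assumption.
import math
from collections import Counter

def majority_vote(predictions):
    """Return the label output by the most component classifiers."""
    return Counter(predictions).most_common(1)[0][0]

def cosine(u, v):
    """Cosine of the angle between two vectors (0.0 if either is zero)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def select_by_euclidean(doc_vec, prototypes, predictions):
    """Trust the classifier whose prototype is nearest in Euclidean distance."""
    best = min(range(len(prototypes)),
               key=lambda i: math.dist(doc_vec, prototypes[i]))
    return predictions[best]

def select_by_cosine(doc_vec, prototypes, predictions):
    """Trust the classifier whose prototype forms the smallest angle
    (largest cosine similarity) with the document vector."""
    best = max(range(len(prototypes)),
               key=lambda i: cosine(doc_vec, prototypes[i]))
    return predictions[best]

print(majority_vote(["sport", "finance", "sport"]))  # prints "sport"
```

Note the difference between the two selection rules: Euclidean distance is sensitive to vector magnitude (document length), while the cosine angle compares direction only, which is why it is often preferred for term-frequency vectors.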
“…All meta-classifiers presented in this article contain eight SVM type classifiers and one Naïve Bayes classifier [11].…”
Section: Meta-classifier Models
confidence: 99%
“…Third, SVM is built on the structural risk minimization principle rather than the empirical risk minimization principle, which minimizes the upper bound of the generalization error. Because of this outstanding property, SVM has already achieved strong results in classification problems [4][5][6][7].…”
Section: Introduction
confidence: 99%
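The structural risk minimization principle referred to in this citation is embodied in the standard soft-margin SVM training objective, where the norm term bounds model capacity (maximizes the margin) and the slack penalty controls empirical error. This is the textbook formulation, not a detail taken from the cited paper:

```latex
% Soft-margin SVM primal problem (standard formulation; the notation
% is conventional, not copied from the cited paper).
\min_{w,\,b,\,\xi}\;\; \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.}\quad y_i\,(w \cdot x_i + b) \ge 1 - \xi_i,
\qquad \xi_i \ge 0,\;\; i = 1,\dots,n .
```

The hyperparameter $C$ trades off the two risks: a small $C$ favors a wide margin (low capacity), while a large $C$ favors fitting the training data (low empirical error).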
“…Therefore, compared with other machine learning methods such as artificial neural networks [2,3,4], SVM has better generalization ability. Within a few years of its introduction, SVM showed excellent performance on many real-world predictive data mining applications such as text categorization [5], time series prediction [6], pattern recognition [7] and image processing [8].…”
Section: Introduction
confidence: 99%