2005
DOI: 10.1016/j.neucom.2004.09.003
Boosting Naïve Bayes text classification using uncertainty-based selective sampling

Cited by 17 publications (2 citation statements) · References 8 publications
“…Bayesian classification is a probabilistic method [9], and it is optimal in the sense that it minimizes the expected cost of misclassification [22]. The Naïve Bayesian classifier is a simplified Bayesian classifier [5], and it has been used successfully in many applications [11,13]. Let C_j, 1 ≤ j ≤ 5, denote the classes of share price trends, and L_1 denote the end time of training.…”
Section: Training the Naïve Bayesian Classifier
confidence: 99%
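The citation statement above describes Naïve Bayes classification: choose the class maximizing the posterior P(c | d) ∝ P(c) ∏ P(w | c), which under 0–1 loss minimizes the expected cost of misclassification. A minimal multinomial sketch of the idea (the class names and training documents below are invented for illustration, not taken from the cited paper):

```python
import math
from collections import Counter

class NaiveBayes:
    """Minimal multinomial Naive Bayes text classifier (illustrative sketch)."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        # Class priors P(c) estimated from label frequencies.
        self.prior = {c: labels.count(c) / len(labels) for c in self.classes}
        # Per-class word counts for the likelihoods P(w | c).
        self.word_counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.word_counts[c].update(doc.split())
        self.vocab = {w for counts in self.word_counts.values() for w in counts}

    def predict(self, doc):
        # Pick the class maximizing log P(c) + sum_w log P(w | c),
        # with Laplace (add-one) smoothing; under 0-1 loss this is the
        # minimum-expected-misclassification-cost decision.
        def score(c):
            total = sum(self.word_counts[c].values())
            s = math.log(self.prior[c])
            for w in doc.split():
                s += math.log((self.word_counts[c][w] + 1) /
                              (total + len(self.vocab)))
            return s
        return max(self.classes, key=score)
```

In a share-price-trend setting like the one quoted, the labels would be the trend classes C_j and each document the textual evidence observed before the training cutoff L_1.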
“…Traditional data representation, classification and retrieval problems usually focus on single modal data [3], [4]. For example, for the problem of text classification, we usually only consider using a data set of text to train a classifier [5], [6]. While for the problem of image representation, only the images are considered to learn the representation parameters [7], [8].…”
Section: Introduction
confidence: 99%