2014 IEEE International Conference on Control System, Computing and Engineering (ICCSCE 2014) 2014
DOI: 10.1109/iccsce.2014.7072685
A non parametric Partial Histogram Bayes learning algorithm for classification applications

Cited by 2 publications (6 citation statements). References 13 publications.
“…Bayesian-based techniques like NB, GMMC, and FNB are more suitable. In a previous work [27], our proposed PHBayes demonstrated faster and more accurate classification compared to NB, GMMC, 1st NN, and NCM. PHBayes does not need to keep the instances in memory, but it requires memory space for the probability densities of the observed histogram bins.…”
Section: Introductionmentioning
confidence: 71%
“…1 illustrates the concept of using the observed histogram as a means to estimate the probability density. Here, two cases are presented using two different probability density functions [27] as shown in Fig. 1a and 1d.…”
Section: Supervised Partial Histogram Bayesmentioning
confidence: 99%
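The passage above describes using the observed histogram of the training data as an estimate of each class-conditional probability density, with classification by the Bayes rule. A minimal sketch of that idea follows; it is an illustration under stated assumptions (64 bins over a fixed [0, 1] range, one feature, equal treatment of empty bins), not the authors' implementation:

```python
import numpy as np

def fit_histogram_density(samples, bins=64, value_range=(0.0, 1.0)):
    """Estimate a class-conditional density from the observed histogram.

    Returns normalized bin probabilities and the bin edges.
    (Bin count and value range here are illustrative assumptions,
    not parameters taken from the paper.)
    """
    counts, edges = np.histogram(samples, bins=bins, range=value_range)
    probs = counts / counts.sum()  # relative frequency per bin
    return probs, edges

def classify(x, class_models, priors):
    """Pick the class maximizing prior * histogram likelihood (Bayes rule)."""
    best_label, best_score = None, -1.0
    for label, (probs, edges) in class_models.items():
        # Locate the bin containing x, clamped to the valid bin range.
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1,
                      0, len(probs) - 1)
        score = priors[label] * probs[idx]
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

For example, fitting one histogram model per class on samples drawn around 0.2 and 0.8 respectively, `classify(0.2, ...)` returns the first class and `classify(0.8, ...)` the second.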
“…This allows fPHBayes to work with both large and small numbers of instances, and with large and small variances, by using a high histogram resolution [37]. The reason for using 64 histogram bins in fPHBayes is to allow it to cover all histogram ranges below 64 bins.…”
Section: Flexible Partial Histogram Bayes Learning (Fphbayes)mentioning
confidence: 99%
“…In this paper, a new Flexible Partial Histogram Bayes learning algorithm (fPHBayes) is proposed as an improvement of PHBayes. Compared with PHBayes in our previous work [37], the proposed fPHBayes is more accurate with both small and large numbers of instances, more flexible with respect to the class probability distribution, and requires fewer parameters. fPHBayes uses a probability distribution derived from smoothing the observed histogram and performs classification using the Bayes rule.…”
Section: Introductionmentioning
confidence: 98%
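The quoted passage says fPHBayes derives its distribution by smoothing the observed histogram, which helps when few instances are available because raw bin counts are sparse and jagged. A hedged sketch of one such smoothing step is shown below; the box (moving-average) kernel and its width are assumptions for illustration, not the smoothing the paper actually uses:

```python
import numpy as np

def smoothed_histogram_density(samples, bins=64, value_range=(0.0, 1.0),
                               kernel_width=5):
    """Smooth the observed histogram to obtain a less jagged density estimate.

    A simple box kernel spreads probability mass into neighboring bins,
    so bins with no observed samples can still receive nonzero density.
    (Kernel choice and width are illustrative assumptions.)
    """
    counts, edges = np.histogram(samples, bins=bins, range=value_range)
    kernel = np.ones(kernel_width) / kernel_width
    smooth = np.convolve(counts, kernel, mode="same")
    probs = smooth / smooth.sum()  # renormalize to a probability distribution
    return probs, edges
```

With a small sample concentrated in one bin, the smoothed estimate assigns nonzero probability to the adjacent bins as well, which is the behavior that makes a smoothed histogram more robust for small numbers of instances.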