Bayesian classifiers based on kernel density estimation: Flexible classifiers
2009. DOI: 10.1016/j.ijar.2008.08.008

Cited by 130 publications (53 citation statements). References 32 publications.
“…This information loss can be avoided, without losing most of the advantages of a discretized NB, by replacing it with an FNB, where the recruitment is discrete and the predictors continuous. A 'flexible naive Bayes' classifier consists of the 'multinomial naive Bayes' classifier supported by the 'kernel-based Bayesian network' paradigm proposed in Pérez et al. (2009), which is based upon a nonparametric density estimation technique, 'kernel density estimation' (Silverman, 1986). This means that the classifier is built by aggregating a mixture of kernels, avoiding any assumption such as normality.…”
Section: Supervised Classification Based Methodology
confidence: 99%
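The flexible naive Bayes described in this citation factorizes each class-conditional density into one univariate kernel density estimate per continuous predictor, so no parametric form such as normality is assumed. A minimal sketch of that idea using SciPy's `gaussian_kde` (the class name and structure here are illustrative, not the authors' implementation):

```python
import numpy as np
from scipy.stats import gaussian_kde


class FlexibleNaiveBayes:
    """Naive Bayes with one univariate Gaussian KDE per (class, feature)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Class priors estimated from relative frequencies
        self.priors_ = {c: np.mean(y == c) for c in self.classes_}
        # One kernel density estimate per class and per continuous feature
        self.kdes_ = {
            c: [gaussian_kde(X[y == c, j]) for j in range(X.shape[1])]
            for c in self.classes_
        }
        return self

    def predict(self, X):
        # Log-posterior (up to a constant): log prior + sum of log densities
        log_post = np.column_stack([
            np.log(self.priors_[c])
            + sum(np.log(self.kdes_[c][j](X[:, j]) + 1e-300)
                  for j in range(X.shape[1]))
            for c in self.classes_
        ])
        return self.classes_[np.argmax(log_post, axis=1)]
```

Because each density is a mixture of kernels centered on the training points, the bandwidth (here SciPy's default Scott rule) is the only smoothing choice, which is what distinguishes this from a Gaussian naive Bayes.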
“…Table 4: Brier score comparison between (1) the pipeline without 'missing imputation' and with a 'multinomial naive Bayes' classifier, (2) with 'missing imputation' and a 'multinomial naive Bayes' classifier, and (3) with 'missing imputation' and the 'multinomial naive Bayes' replaced by a 'flexible naive Bayes'.

Species | (1) | (2) | (3)
… | 40.9 ± 4.9 | 42.9 ± 7.8 | 44.7 ± 11.2
Albacore | 58.1 ± 5.8 | 61.4 ± 6.9 | 34.6 ± 5.1
Blue whiting | 51.3 ± 7.6 | 52.5 ± 7.8 | 43.9 ± 8.1

The flexible classifiers follow Pérez et al. (2009), and their implementation is available on request from this author.…”
Section: Performance Estimation
confidence: 99%
“…Classifier Training: Flexible naïve Bayes was used in this section (Pérez, Larrañaga, and Inza, 2009). The classifier training consisted of estimating the probability density function for each object and background color feature.…”
Section: B. Calculation Of Color Feature
confidence: 99%
“…Pérez et al [50] have recently proposed a new approach for Flexible Bayesian classifiers based on kernel density estimation that extends the FNBC proposed by [42] in order to handle dependent attributes and abandons the independence assumption. In this work, three classifiers: tree-augmented naive Bays, a kdependence Bayesian classifier and a complete graph are adapted to the support kernel Bayesian network paradigm.…”
Section: Appendix: Naive Bayesian Classifiersmentioning
confidence: 99%
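The extension this citation describes replaces the naive independence assumption with conditional kernel densities, so that each attribute may depend on another. A minimal sketch for one fixed dependence structure, a chain where feature j depends on feature j−1, using the identity p(x_j | x_{j−1}) = p(x_j, x_{j−1}) / p(x_{j−1}) with each term a KDE (the class name and the fixed chain are illustrative assumptions; the paper's classifiers learn richer structures such as trees and k-dependences):

```python
import numpy as np
from scipy.stats import gaussian_kde


class KernelChainClassifier:
    """Kernel-based Bayesian classifier over a fixed chain structure:
    feature j depends on feature j-1 within each class."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_, self.marg_, self.joint_ = {}, {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)
            # 1-D KDE per feature, 2-D KDE per adjacent (parent, child) pair
            self.marg_[c] = [gaussian_kde(Xc[:, j]) for j in range(X.shape[1])]
            self.joint_[c] = [gaussian_kde(Xc[:, [j - 1, j]].T)
                              for j in range(1, X.shape[1])]
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            # Root feature uses its marginal density
            ll = (np.log(self.priors_[c])
                  + np.log(self.marg_[c][0](X[:, 0]) + 1e-300))
            for j in range(1, X.shape[1]):
                # Conditional density as ratio of joint to parent marginal
                joint = self.joint_[c][j - 1](X[:, [j - 1, j]].T)
                parent = self.marg_[c][j - 1](X[:, j - 1])
                ll += np.log(joint + 1e-300) - np.log(parent + 1e-300)
            scores.append(ll)
        return self.classes_[np.argmax(np.column_stack(scores), axis=1)]
```

A tree-augmented or k-dependence variant follows the same ratio-of-KDEs pattern, only with the parent of each feature chosen by a structure-learning step instead of fixed in advance.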