1997
DOI: 10.1023/a:1007413511361
Abstract: The simple Bayesian classifier is known to be optimal when attributes are independent given the class, but the question of whether other sufficient conditions for its optimality exist has so far not been explored. Empirical results showing that it performs surprisingly well in many domains containing clear attribute dependences suggest that the answer to this question may be positive. This article shows that, although the Bayesian classifier's probability estimates are only optimal under quadratic lo…



Cited by 2,440 publications (234 citation statements)
References 23 publications
“…Although perhaps not recognised when the approach was first introduced, substructural analysis is an example of a naive Bayesian classifier (NBC) [17,63,64]. An NBC is a simple classification algorithm that is based on the use of Bayes' theorem and on strong assumptions as to the statistical independence of the descriptors characterising the objects that are to be classified.…”
Section: Substructural Analysis, Naive Bayesian Classifiers and Group (mentioning)
confidence: 99%
“…An NBC is a simple classification algorithm that is based on the use of Bayes' theorem and on strong assumptions as to the statistical independence of the descriptors characterising the objects that are to be classified. The use of "naive" arises from the independence assumptions: these are often demonstrably incorrect, but this has not seemed to affect the performance of the classifier in many application domains [63].…”
Section: Substructural Analysis, Naive Bayesian Classifiers and Group (mentioning)
confidence: 99%
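The excerpts above describe an NBC as Bayes' theorem combined with a strong conditional-independence assumption over the descriptors of the objects being classified. A minimal sketch of that idea for categorical features follows; the toy data, add-one smoothing, and function names are illustrative assumptions, not taken from the cited works:

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (feature_tuple, label) pairs."""
    class_counts = Counter(label for _, label in examples)
    feature_counts = defaultdict(int)  # (position, value, label) -> count
    for feats, label in examples:
        for i, v in enumerate(feats):
            feature_counts[(i, v, label)] += 1
    return class_counts, feature_counts

def predict(model, feats):
    """Return argmax_c P(c) * prod_i P(x_i | c), computed in log space."""
    class_counts, feature_counts = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label, n in class_counts.items():
        score = math.log(n / total)  # log prior P(c)
        for i, v in enumerate(feats):
            # Add-one (Laplace) smoothing so unseen values don't zero out
            # the product; the independence assumption is the per-feature
            # factorisation itself.
            score += math.log((feature_counts[(i, v, label)] + 1) / (n + 2))
        if score > best_score:
            best, best_score = label, score
    return best
```

As the final excerpt notes, the independence assumptions are often demonstrably false in practice, yet the argmax over classes can remain correct even when the estimated probabilities are poorly calibrated.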
“…Four different machine learning methods were used for data analysis: K Nearest Neighbours (K-NN) [15], Naïve Bayes [16], C4.5 Decision Tree [17,18], and Support Vector Machine (SVM) [19]. The selected algorithms follow distinct approaches to model development.…”
Section: Methods (mentioning)
confidence: 99%
“…However, when dealing with real-world problems, this assumption often does not hold. In some cases, nevertheless, the naive Bayes paradigm has shown good performance (12,13). The naive Bayes classifier applies Bayes' theorem in order to predict, for each instance x, the class c∈C with the maximum a posteriori probability.…”
Section: Materials and Methods (unclassified)
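The maximum a posteriori rule described in the excerpt, combined with the independence assumption, can be written as follows (the notation x_1, …, x_n for the n descriptors of instance x is assumed here):

```latex
c^{*} = \arg\max_{c \in C} \; P(c \mid x)
      = \arg\max_{c \in C} \; P(c) \prod_{i=1}^{n} P(x_i \mid c)
```

The second equality drops the constant P(x) and factorises P(x | c) into per-descriptor terms; this factorisation is exactly the "naive" independence assumption the excerpts discuss.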