Entropy, 2021. DOI: 10.3390/e23121703

Exact Learning Augmented Naive Bayes Classifier

Abstract: Earlier studies have shown that classification accuracies of Bayesian networks (BNs) obtained by maximizing the conditional log likelihood (CLL) of a class variable, given the feature variables, were higher than those obtained by maximizing the marginal likelihood (ML). However, differences between the performances of the two scores in the earlier studies may be attributed to the fact that they used approximate learning algorithms, not exact ones. This paper compares the classification accuracies of BNs with a…
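The abstract's contrast between the two scores can be illustrated with a minimal sketch (an assumed illustration, not the paper's exact-learning implementation): for a naive Bayes structure, the CLL sums log P(class | features) over the data, directly targeting classification, whereas generative scores such as the marginal likelihood reward fit to the joint distribution. All function names here are hypothetical.

```python
import math
from collections import Counter, defaultdict

def fit_naive_bayes(data, alpha=1.0):
    """Laplace-smoothed parameters for a naive Bayes model.
    data: list of (class_label, feature_tuple) records."""
    class_counts = Counter(c for c, _ in data)
    n_feats = len(data[0][1])
    # feat_counts[(j, c)][v] = count of feature j taking value v given class c
    feat_counts = defaultdict(Counter)
    feat_vals = [set() for _ in range(n_feats)]
    for c, x in data:
        for j, v in enumerate(x):
            feat_counts[(j, c)][v] += 1
            feat_vals[j].add(v)
    return class_counts, feat_counts, feat_vals, alpha

def log_joint(model, c, x):
    """log P(c, x) under the fitted naive Bayes model."""
    class_counts, feat_counts, feat_vals, alpha = model
    n, k = sum(class_counts.values()), len(class_counts)
    lp = math.log((class_counts[c] + alpha) / (n + alpha * k))
    for j, v in enumerate(x):
        cnt = feat_counts[(j, c)]
        lp += math.log((cnt[v] + alpha)
                       / (sum(cnt.values()) + alpha * len(feat_vals[j])))
    return lp

def cll(model, data):
    """Conditional log likelihood: sum of log P(c | x) over the data,
    obtained by normalizing the joint over all classes."""
    classes = list(model[0])
    total = 0.0
    for c, x in data:
        joints = {ci: log_joint(model, ci, x) for ci in classes}
        norm = math.log(sum(math.exp(v) for v in joints.values()))
        total += joints[c] - norm
    return total
```

Maximizing `cll` over candidate structures is the discriminative criterion; the ML score studied in the paper is instead a Bayesian generative score, so the two can prefer different networks.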

Cited by 22 publications (29 citation statements). References 29 publications.
“…Parts of this research were reported in an earlier conference paper published by Sugahara et al (2018).…”
Section: Acknowledgments — mentioning
confidence: 97%
“…Therefore, the model can be useful for an entity (expert, broker, or bidder) to determine relevant improvement actions to improve currently existing products. The proposed model was developed by combining these instruments: SMART(-ER) method (S-specific, M-measurable, A-achievable, R-relevant, realistic, or reward, T-'based on timeline' or timebound, E-exciting or evaluated, R-recorded or reward) [32], brainstorming method (BM) [33], survey with Likert scale [1,[34][35][36], Weighted Sum Model (WSM) [3,[37][38][39], relative states scale [3,4,40], and Naïve Bayesian Classifier (NBC) [41][42][43][44]. The general concept of the model is shown in Figure 1.…”
Section: Concept of Model — mentioning
confidence: 99%
“…Furthermore, the Naive Bayesian Classifier (NBC) is effective in predicting or classifying any data shown in qualitative and quantitative ways [41][42][43][44]. Despite that, the universality of the model is a possibility of its application as part of the ongoing improvement of products, and to the sustainable development of products considering customers' expectations.…”
mentioning
confidence: 99%
“…Therefore, there are no edges between the attributes, as shown in the example in Figure 1. Improvements to the naive Bayes include feature weighting techniques [32,33], boosted parameter learning [34], and the exact learning augmented naive Bayes classifier [35]. TAN, by contrast, starts from a fully connected network whose edge weights are computed using the conditional mutual information between pairs of attributes.…”
Section: Background, 2.1 Bayesian Network — mentioning
confidence: 99%
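The TAN construction quoted above — weighting feature pairs by conditional mutual information and keeping a spanning tree — can be sketched as follows. This is a generic Chow–Liu-style illustration under assumed discrete data, not code from the cited works; the function names are hypothetical.

```python
import math
from collections import Counter
from itertools import combinations

def cond_mutual_info(xs, ys, cs):
    """I(X; Y | C) estimated from parallel lists of discrete values."""
    n = len(cs)
    pxyc = Counter(zip(xs, ys, cs))
    pxc = Counter(zip(xs, cs))
    pyc = Counter(zip(ys, cs))
    pc = Counter(cs)
    mi = 0.0
    for (x, y, c), nxyc in pxyc.items():
        # empirical p(x,y,c) * log [ p(x,y|c) / (p(x|c) p(y|c)) ]
        mi += (nxyc / n) * math.log((nxyc * pc[c]) / (pxc[(x, c)] * pyc[(y, c)]))
    return mi

def tan_edges(features, cs):
    """features: list of feature columns; cs: class column.
    Returns a maximum-weight spanning tree over the features
    (Prim's algorithm), the tree step of TAN learning."""
    m = len(features)
    w = {(i, j): cond_mutual_info(features[i], features[j], cs)
         for i, j in combinations(range(m), 2)}
    in_tree, edges = {0}, []
    while len(in_tree) < m:
        best = max(((i, j) for i, j in w
                    if (i in in_tree) != (j in in_tree)),
                   key=lambda e: w[e])
        edges.append(best)
        in_tree |= set(best)
    return edges
```

A full TAN learner would then direct the tree from a chosen root and add the class variable as a parent of every feature; the sketch stops at the undirected tree, which is where it differs structurally from plain naive Bayes.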