2017
DOI: 10.3390/e19090501

Attribute Value Weighted Average of One-Dependence Estimators

Abstract: Of numerous proposals to improve the accuracy of naive Bayes by weakening its attribute independence assumption, semi-naive Bayesian classifiers which utilize one-dependence estimators (ODEs) have been shown to be able to approximate the ground-truth attribute dependencies; meanwhile, the probability estimation in ODEs is effective, thus leading to excellent performance. In previous studies, ODEs were exploited directly in a simple way. For example, averaged one-dependence estimators (AODE) weaken the…
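The AODE scheme the abstract refers to averages the predictions of all one-dependence estimators (SPODEs), each taking one attribute as the sole parent of the others. The sketch below is an illustrative reconstruction of that idea, not the paper's implementation; the optional `weights` argument only gestures at the weighted variants discussed here, and all function and variable names are hypothetical:

```python
import numpy as np

def aode_predict(X_train, y_train, x, weights=None):
    """Classify instance x by a (weighted) average of one-dependence
    estimators: each attribute in turn serves as the single parent
    ("super-parent") of all other attributes."""
    n, d = X_train.shape
    classes = np.unique(y_train)
    if weights is None:
        weights = np.ones(d) / d  # uniform average recovers plain AODE
    scores = {}
    for c in classes:
        Xc = X_train[y_train == c]  # training rows with class c
        total = 0.0
        for i in range(d):  # SPODE with attribute i as the super-parent
            # P(y, x_i), Laplace-smoothed
            joint = (np.sum(Xc[:, i] == x[i]) + 1) / (n + len(classes))
            mask = Xc[:, i] == x[i]
            denom = int(mask.sum())
            prod = 1.0
            for j in range(d):
                if j == i:
                    continue
                # P(x_j | y, x_i), Laplace-smoothed over attribute j's values
                num = np.sum(Xc[mask][:, j] == x[j])
                n_vals = len(np.unique(X_train[:, j]))
                prod *= (num + 1) / (denom + n_vals)
            total += weights[i] * joint * prod
        scores[c] = total
    return max(scores, key=scores.get)
```

With uniform weights this is standard AODE; the weighted schemes surveyed by the citing papers replace the uniform vector with learned per-SPODE (or per-attribute-value) weights.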

Cited by 29 publications (15 citation statements)
References 21 publications
“…The ensemble classification algorithms include adaptive boosting (AdaBoost) [30], bootstrap aggregating (bagging) [31], and random forest [32,33]. The algorithm extensions include hidden naïve Bayes (HNB) and aggregating one-dependence estimators (AODE) [19–22].…”
Section: Methods
confidence: 99%
“…In principle, ML has four steps: problem definition, data collection and preparation, model building, and model prediction. There are 11 state-of-the-art machine learning techniques [18–22], namely, logistic regression (LR), k-nearest neighbor (kNN), support vector machine (SVM), naïve Bayes (NB), Bayesian network (BN), decision tree (C4.5), AdaBoosting, bagging, random forest (RF), hidden naïve Bayes (HNB), and aggregating one-dependence estimators (AODE).…”
Section: Introduction
confidence: 99%
“…Bayesian network (BN) [1]–[4] provides a powerful tool for knowledge representation and inference under conditions of uncertainty. Since the 1990s, the study of Bayesian network classifier (BNC) for classification has attracted tremendous attention after the success of Naive Bayes (NB) [5]–[8].…”
Section: Introduction
confidence: 99%
“…• Model weighting [4], [17] computes the weight associated with each SPODE to combine their probability estimates linearly.…”
Section: Introduction
confidence: 99%