2015
DOI: 10.3390/e17064134

General and Local: Averaged k-Dependence Bayesian Classifiers

Abstract: The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify changes in the interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of f…
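For context on the KDB family the abstract refers to, the sketch below illustrates the standard KDB structure-learning idea: attributes are ranked by mutual information with the class, and each attribute takes up to k of the earlier attributes as parents, scored by conditional mutual information given the class. This is an illustrative reconstruction under assumed discrete, integer-coded data; the function names are not from the paper.

```python
# Minimal sketch of standard KDB structure learning, assuming discrete
# attributes encoded as small non-negative integers. Names and data layout
# are illustrative assumptions, not the paper's code.
import numpy as np

def mutual_information(x, y):
    """Empirical I(X; Y) from joint counts of two integer-coded vectors."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def conditional_mutual_information(x, y, c):
    """I(X; Y | C) = sum over c of P(C=c) * I(X; Y | C=c)."""
    return sum((c == cv).mean() * mutual_information(x[c == cv], y[c == cv])
               for cv in np.unique(c))

def kdb_structure(X, y, k):
    """Map each attribute index to its list of parent attribute indices.

    Attributes are ordered by I(X_i; C); each attribute then takes up to k
    parents, chosen among earlier attributes by the largest I(X_i; X_j | C).
    The class variable is an implicit parent of every attribute.
    """
    order = sorted(range(X.shape[1]),
                   key=lambda i: mutual_information(X[:, i], y),
                   reverse=True)
    parents = {}
    for pos, i in enumerate(order):
        earlier = order[:pos]
        ranked = sorted(earlier,
                        key=lambda j: conditional_mutual_information(
                            X[:, i], X[:, j], y),
                        reverse=True)
        parents[i] = ranked[:k]
    return parents
```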

Cited by 9 publications (15 citation statements). References 18 publications.
“…Dependence relations between attributes may vary from one testing instance to another [5]. The conventional KDB structure can not automatically adapt to different testing instances.…”
Section: Tan and Watan
Mentioning confidence: 99%
“…Hence, classifying a testing instance using LKDB has time complexity O((l + k)n²). By combining the predictions of the robust KDB model and the flexible LKDB model, the averaged KDB (AKDB) achieves even higher classification performance [5].…”
Section: Tan and Watan
Mentioning confidence: 99%
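The AKDB combination mentioned in this statement can be pictured as a simple averaging of the two models' class posteriors. The sketch below shows only that final step, assuming both the general KDB model and the local LKDB model already return normalized P(c | x) vectors; the function name and interface are hypothetical.

```python
# Hedged sketch of the AKDB combination step only: average the class
# posteriors of the general KDB model and the instance-specific LKDB model,
# then predict the arg-max class. Inputs are assumed to be normalized
# P(c | x) vectors; this is not the paper's actual code.
import numpy as np

def akdb_predict(p_kdb, p_lkdb):
    """Return (predicted class index, averaged posterior vector)."""
    p_avg = (np.asarray(p_kdb, dtype=float) + np.asarray(p_lkdb, dtype=float)) / 2.0
    return int(np.argmax(p_avg)), p_avg
```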