2014
DOI: 10.1007/s11590-014-0803-1
Structure learning of Bayesian Networks using global optimization with applications in data classification

Abstract: Bayesian Networks are increasingly popular methods of modeling uncertainty in artificial intelligence and machine learning. A Bayesian Network consists of a directed acyclic graph in which each node represents a variable and each arc represents a probabilistic dependency between two variables. Constructing a Bayesian Network from data is a learning process with two steps: structure learning and parameter learning. Learning the network structure from data is the most difficult task in this process. This…
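Since a Bayesian Network structure must be a directed acyclic graph, any structure-learning procedure needs an acyclicity check on candidate graphs. A minimal sketch, assuming a structure represented as a node-to-parent-set mapping (the `parents` representation and `is_dag` helper are illustrative, not from the paper):

```python
from collections import deque

def is_dag(parents):
    """Check that a candidate structure (node -> set of parent nodes) is
    acyclic, using Kahn's topological sort."""
    nodes = set(parents) | {p for ps in parents.values() for p in ps}
    indeg = {n: 0 for n in nodes}
    children = {n: [] for n in nodes}
    for node, ps in parents.items():
        for p in ps:
            children[p].append(node)
            indeg[node] += 1
    queue = deque(n for n in nodes if indeg[n] == 0)
    seen = 0
    while queue:
        n = queue.popleft()
        seen += 1
        for c in children[n]:
            indeg[c] -= 1
            if indeg[c] == 0:
                queue.append(c)
    # Every node was ordered iff the graph has no directed cycle.
    return seen == len(nodes)

# A chain A -> B -> C is a valid BN skeleton; adding C -> A creates a cycle.
print(is_dag({"A": set(), "B": {"A"}, "C": {"B"}}))  # True
print(is_dag({"A": {"C"}, "B": {"A"}, "C": {"B"}}))  # False
```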

Cited by 12 publications (5 citation statements)
References 30 publications
“…We compared the proposed algorithm with several benchmark classifiers [12, 13, 23] presented in the literature. The statistical results of all evaluated functions using 20 rounds of 10-fold cross validation are shown in Table 5.…”
Section: Results
confidence: 99%
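The evaluation protocol quoted above, 20 rounds of 10-fold cross validation, can be sketched as a plain split generator. The function name, parameters, and toy sample count below are illustrative assumptions, not the authors' code:

```python
import random

def repeated_kfold(n_samples, k=10, rounds=20, seed=0):
    """Yield (round, fold, train_idx, test_idx) for `rounds` independently
    shuffled k-fold splits, as in '20 rounds of 10-fold cross validation'."""
    rng = random.Random(seed)
    indices = list(range(n_samples))
    for r in range(rounds):
        rng.shuffle(indices)
        # Slice every k-th index into a fold; folds are disjoint and cover all samples.
        folds = [indices[i::k] for i in range(k)]
        for f, test_idx in enumerate(folds):
            test_set = set(test_idx)
            train_idx = [i for i in indices if i not in test_set]
            yield r, f, train_idx, test_idx

splits = list(repeated_kfold(n_samples=50, k=10, rounds=20))
print(len(splits))  # 200 train/test evaluations per classifier
```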
See 1 more Smart Citation
“…We compared the proposed algorithm when with several benchmark classifiers [ 12 , 13 , 23 ] that were presented in the literature. The statistical results of all evaluated functions using 20 rounds of 10-fold cross validation are shown in Table 5 .…”
Section: Resultsmentioning
confidence: 99%
“…Pernkopf and Bilmes [22] proposed a greedy heuristic strategy to determine the attribute order by pairwise comparison of attributes. Taheri et al [23] proposed to build a dynamic structure without specifying k a priori, and they proved that the resulting BNC is optimal.…”
Section: Bayesian Network and Markov Blanket
confidence: 99%
“…Jing et al [31] presented the boosted BNC, which greedily builds the structure from the arcs with the highest conditional mutual information (CMI). Taheri and Mammadov [32] provided an algorithm for adding arcs by using the conditional probability of attributes given their parents. These BNCs are 1-dependence classifiers without a prior attribute order.…”
Section: Prior Work
confidence: 99%
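The CMI-based arc selection mentioned above rests on estimating conditional mutual information I(X; Y | Z) from data. A minimal empirical estimator, assuming categorical samples given as (x, y, z) tuples (the function name and toy data are illustrative assumptions):

```python
import math
from collections import Counter

def conditional_mutual_information(samples):
    """Estimate I(X; Y | Z) from (x, y, z) tuples via empirical frequencies:
    sum over (x, y, z) of p(x,y,z) * log( p(x,y,z) * p(z) / (p(x,z) * p(y,z)) )."""
    n = len(samples)
    pxyz = Counter(samples)
    pxz = Counter((x, z) for x, y, z in samples)
    pyz = Counter((y, z) for x, y, z in samples)
    pz = Counter(z for x, y, z in samples)
    cmi = 0.0
    for (x, y, z), c in pxyz.items():
        # Counts cancel n, so the ratio below uses raw counts directly.
        cmi += (c / n) * math.log((c * pz[z]) / (pxz[(x, z)] * pyz[(y, z)]))
    return cmi

# X and Y are identical given Z, so they are maximally dependent: CMI = log 2.
data = [(0, 0, 0), (1, 1, 0), (0, 0, 1), (1, 1, 1)] * 25
print(round(conditional_mutual_information(data), 3))  # 0.693
```

In a 1-dependence classifier, arcs between attribute pairs would be ranked by such estimates conditioned on the class variable.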
“…Building a BN from data is called a learning process and involves two steps: parameter learning and structure learning [9]. Structure learning has been studied more frequently than parameter learning. Common structure learning methods for BNs are exhaustive search, the hill-climbing algorithm, and the K2 algorithm.…”
Section: Introduction
confidence: 99%