2002
DOI: 10.1016/s0004-3702(02)00191-1
Learning Bayesian networks from data: An information-theory based approach

Abstract: This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) tha…

Cited by 728 publications (530 citation statements)
References 27 publications
“…We can make a comparison between the proposed two-step method and the single search process. For that purpose we have used a constraint-based algorithm, BN Power Constructor (BNPC) [11] (we use the software package available at http://www.cs.ualberta.ca/~jcheng/bnsoft.htm). In Table 2 we show the results obtained by the BNPC algorithm, which are worse than those obtained with any of the different orderings θ used as input to the BENEDICT-step and K2 algorithms.…”
Section: Results
confidence: 99%
“…The number and complexity of the tests are critical for the efficiency and reliability of these methods. Some of the algorithms based on this approach can be found in [10,11,16].…”
Section: Learning Belief Networks
confidence: 99%
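Constraint-based learners such as BNPC build their decisions on conditional independence tests over the data. As an illustration only (not the paper's implementation, and with invented variable names and data layout), a chi-square CI test over discrete samples can be sketched like this:

```python
from collections import Counter

def chi2_ci_test(data, x, y, z=()):
    """Illustrative chi-square test of X independent of Y given Z.

    `data` is a list of dicts mapping variable names to discrete values.
    Returns (statistic, degrees of freedom); a statistic that is small
    relative to the degrees of freedom suggests conditional independence.
    """
    # Partition the samples by the configuration of the conditioning set Z.
    strata = {}
    for row in data:
        strata.setdefault(tuple(row[v] for v in z), []).append(row)

    stat, dof = 0.0, 0
    for rows in strata.values():
        n = len(rows)
        cx = Counter(r[x] for r in rows)            # marginal counts of X
        cy = Counter(r[y] for r in rows)            # marginal counts of Y
        cxy = Counter((r[x], r[y]) for r in rows)   # joint counts of (X, Y)
        # Compare observed joint counts against the independence expectation.
        for vx in cx:
            for vy in cy:
                expected = cx[vx] * cy[vy] / n
                observed = cxy.get((vx, vy), 0)
                stat += (observed - expected) ** 2 / expected
        dof += (len(cx) - 1) * (len(cy) - 1)
    return stat, dof
```

As the excerpt above notes, the number and complexity of such tests (in particular, the size of the conditioning set Z, which thins out each stratum) is what drives the efficiency and reliability of constraint-based methods.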
“…Compared to classical statistical approaches, Bayesian belief networks have a distinct advantage [15]. The BBN has become a powerful tool not only for knowledge representation but also for reasoning under conditions of uncertainty [16], and for several decades it has been applied to real-world problems such as building medical diagnostic systems, forecasting, and manufacturing process control [17]. Nowadays the BBN has been extended to other applications including software risk management [18], ecosystem and environmental management [19], and transportation [20].…”
Section: Bayesian Belief Network
confidence: 99%
“…In a comparative study [23], the authors surveyed several currently used structure learning algorithms, namely PC [60] and IC/IC* [50] (causality search using statistical tests to evaluate conditional independence), BN Power Constructor (BNPC) [11] (which also uses conditional independence tests), and methods based on a scoring criterion, such as Minimal Weight Spanning Tree (MWST) [16] (intelligent weighting of the edges followed by a well-known minimal-weight-tree algorithm), K2 [18] (maximisation of P(G|D) using Bayes' rule and a topological order on the nodes), Greedy search [12] (find the best neighbour and iterate), and SEM [24] (an extension of the EM meta-algorithm to the structure learning problem). In any case, the problem of learning an optimal Bayesian network from a given dataset is NP-hard [13].…”
Section: Parameter and Structure Learning
confidence: 99%
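The score-based family mentioned in the excerpt above (K2, greedy search, SEM) evaluates candidate parent sets with a marginal-likelihood score rather than CI tests. A minimal sketch of the Cooper-Herskovits log-score that K2 maximises, under the assumption of discrete data stored as a list of dicts (the function and variable names here are illustrative, not from the cited papers):

```python
import math
from collections import Counter

def k2_log_score(data, node, parents, arity):
    """Illustrative log Cooper-Herskovits score of `node` given `parents`.

    Implements log P(D_node | parent set) under the K2 uniform Dirichlet
    prior: for each parent configuration j,
        log[(r-1)! / (N_j + r - 1)!] + sum_k log(N_jk!),
    where r is the node's arity, N_jk the count of state k under config j,
    and N_j = sum_k N_jk.
    """
    r = arity[node]
    counts = Counter()  # N_jk indexed by (parent configuration, node state)
    for row in data:
        counts[(tuple(row[p] for p in parents), row[node])] += 1

    score = 0.0
    for cfg in {cfg for cfg, _ in counts}:
        n_j = sum(counts[(cfg, k)] for k in range(r))
        # (r-1)!/(N_j+r-1)! expressed via log-gamma to avoid overflow.
        score += math.lgamma(r) - math.lgamma(n_j + r)
        for k in range(r):
            score += math.lgamma(counts[(cfg, k)] + 1)  # log N_jk!
    return score
```

K2 greedily adds, for each node in the given topological order, the parent that most increases this score, stopping when no addition helps; that greedy restriction is one practical answer to the NP-hardness result [13] quoted above.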