2006
DOI: 10.1007/11766247_16
An Improved LAZY-AR Approach to Bayesian Network Inference

Abstract: We propose LAZY arc-reversal with variable elimination (LAZY-ARVE) as a new approach to probabilistic inference in Bayesian networks (BNs). LAZY-ARVE is an improvement upon LAZY arc-reversal (LAZY-AR), which was very recently proposed and empirically shown to be the state-of-the-art method for exact inference in discrete BNs. The primary advantage of LAZY-ARVE over LAZY-AR is that the former only computes the actual distributions passed during inference, whereas the latter may perform unnecessary comp…

Cited by 7 publications (5 citation statements); References 14 publications.
“…The messages we identify are precisely those that will be propagated, provided that AR is chosen as the algorithm for building messages. However, Butz and Hua (2006) pointed out that during its execution distributions can be built that will not be passed as messages, nor will they be needed in the construction of the distributions to be passed as messages. The reason is that in the elimination of a variable v_i, AR constructs 3k − 1 distributions and outputs k of them, where k is the number of children of v_i with respect to the distributions at the sending join tree node.…”
Section: Advantages
confidence: 99%
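The count stated above (3k − 1 distributions built, only k passed on) can be made concrete with a short sketch; the function below is purely illustrative and not part of the cited algorithms:

```python
def ar_distribution_counts(k):
    """For eliminating a variable with k children, arc reversal (AR)
    constructs 3k - 1 distributions but outputs only k of them as
    messages (per the statement cited above)."""
    constructed = 3 * k - 1
    output = k
    wasted = constructed - output  # distributions built but never passed
    return constructed, output, wasted

# e.g. eliminating a variable with 3 children:
# AR builds 8 distributions, passes 3, and 5 are unnecessary.
print(ar_distribution_counts(3))  # (8, 3, 5)
```

This is why the overhead grows with the number of children of the eliminated variable: 2k − 1 of the constructed distributions are never passed as messages.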
“…Our approach uses VE to build distributions in computer memory. AR is not well suited for message construction in prioritized join tree propagation, because it constructs a new probability distribution for every child of the variable being eliminated [3]. As some of these new distributions can be deemed relevant and the rest irrelevant, we instead call VE to build the relevant distributions as needed.…”
Section: Physical Construction Of Messages
confidence: 99%
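Variable elimination (VE), as referenced above, builds exactly one new distribution per elimination step by multiplying only the factors that mention the variable and summing it out. A minimal sketch, assuming binary variables and factors represented as (variables, table) pairs — an illustrative toy, not the authors' implementation:

```python
from itertools import product

# A factor is a pair: (tuple of variable names, dict mapping
# assignment tuples over those variables to probabilities).

def multiply(f, g):
    """Pointwise product of two factors (binary domains assumed)."""
    fv, ft = f
    gv, gt = g
    vars_ = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for asg in product([0, 1], repeat=len(vars_)):
        a = dict(zip(vars_, asg))
        table[asg] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return (vars_, table)

def sum_out(f, var):
    """Marginalize `var` out of factor f."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for asg, p in ft.items():
        key = tuple(a for v, a in zip(fv, asg) if v != var)
        table[key] = table.get(key, 0.0) + p
    return (keep, table)

def eliminate(factors, var):
    """One VE step: combine only the factors mentioning `var`,
    sum `var` out, and return the updated factor list -- exactly
    one new distribution is built per elimination."""
    relevant = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    prod_f = relevant[0]
    for f in relevant[1:]:
        prod_f = multiply(prod_f, f)
    return rest + [sum_out(prod_f, var)]
```

For example, with p(a) and p(b|a), `eliminate([pa, pba], 'a')` returns the single factor p(b); factors not mentioning the eliminated variable are left untouched, which is the sense in which VE avoids building superfluous distributions.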
“…It is also worth contrasting our work here with our previous work. In [3], we suggest applying the VE algorithm to build the messages in a Lazy AR schedule rather than applying the AR algorithm to construct the messages as Lazy AR does. We explicitly demonstrated that the AR algorithm can build probability distributions that will not be passed as messages, nor are they needed in the construction of the messages that will be passed.…”
Section: Example
confidence: 99%
“…Given an input query p(X|Y = y), our system computes the posterior probabilities of the variables in X using our own inference algorithm, which is based on the recent work in [3]. More specifically, our algorithm is a hybrid approach utilizing join tree propagation to guide the inference procedure.…”
Section: A Prototype Interface
confidence: 99%
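The query form p(X|Y = y) mentioned above conditions on evidence and normalizes. A tiny enumeration sketch over a hypothetical two-variable binary network (illustrative numbers, not the authors' system):

```python
# Toy network: p(a) and p(b|a), binary variables (hypothetical values).
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,   # keys are (b, a)
               (0, 1): 0.3, (1, 1): 0.7}

def posterior_a_given_b(b):
    """Compute p(A | B = b) by enumerating the joint p(a, b) = p(a) p(b|a)
    for the observed b, then normalizing."""
    unnorm = {a: p_a[a] * p_b_given_a[(b, a)] for a in (0, 1)}
    z = sum(unnorm.values())
    return {a: v / z for a, v in unnorm.items()}
```

Enumeration is exponential in general; the point of join-tree propagation in the cited work is to organize these computations so that only needed distributions are built.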
“…Of the three hybrid approaches tested, Lazy arc-reversal (Lazy-AR) was empirically shown to be the state-of-the-art method for exact inference in discrete BNs [7]. In [3], we proposed Lazy arc-reversal with variable elimination (Lazy-ARVE) as a new approach to BN inference and illustrated its benefits over Lazy-AR.…”
Section: Introduction
confidence: 99%