Join tree propagation utilizing both arc reversal and variable elimination

2011. DOI: 10.1016/j.ijar.2010.11.006

Abstract: In this paper, we put forth the first join tree propagation algorithm that selectively applies either arc reversal (AR) or variable elimination (VE) to build the propagated messages. Our approach utilizes a recent method for identifying the propagated join tree messages a priori. When it is determined that precisely one message is to be constructed at a join tree node, VE is utilized to build this distribution; otherwise, AR is applied as it is better suited to construct multiple distributions passed between ne…

Cited by 11 publications (8 citation statements). References 17 publications.
“…Here, X and E are disjoint subsets of U , and E is observed taking value e. We describe a basic algorithm for computing p(X|E = e), called variable elimination (VE), first put forth in [17]. We do not consider alternative approaches to inference such as conditioning [6] and join tree propagation [1,2,10]. Inference involves the elimination of variables.…”
Section: Inference
confidence: 99%
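The citation above describes variable elimination: computing a marginal p(X|E = e) by repeatedly summing variables out of a product of factors. As a minimal sketch of that idea (the toy two-node network A → B, the factor representation, and all function names here are illustrative assumptions, not taken from the cited papers):

```python
from itertools import product

# A minimal sketch of variable elimination over binary discrete factors.
# A factor is a dict mapping assignments (tuples, ordered by the factor's
# variable list) to probabilities. Toy network A -> B is an assumption.

def multiply(f1, vars1, f2, vars2):
    """Pointwise product of two factors; returns (table, variable list)."""
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for assign in product([0, 1], repeat=len(out_vars)):
        env = dict(zip(out_vars, assign))
        table[assign] = (f1[tuple(env[v] for v in vars1)]
                         * f2[tuple(env[v] for v in vars2)])
    return table, out_vars

def sum_out(f, vars_, var):
    """Eliminate `var` from factor `f` by summing over its values."""
    i = vars_.index(var)
    table = {}
    for assign, p in f.items():
        key = assign[:i] + assign[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return table, vars_[:i] + vars_[i + 1:]

# Toy network: p(A) and p(B|A); query the marginal p(B) by eliminating A.
pA = {(0,): 0.6, (1,): 0.4}                   # p(A)
pBA = {(0, 0): 0.9, (0, 1): 0.1,              # p(B|A), keys ordered (A, B)
       (1, 0): 0.2, (1, 1): 0.8}

joint, jvars = multiply(pA, ["A"], pBA, ["A", "B"])
pB, _ = sum_out(joint, jvars, "A")
print(pB)
```

Evidence E = e would be handled by restricting each factor to the observed value before eliminating; the elimination loop itself is unchanged.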
“…For example, consider potential ψ(g, i = 1, j) in (2). F (ψ(g, i = 1, j)), the evidence expanded form, can be easily obtained in a recursive manner as follows:…”
Section: Understanding Semantics
confidence: 99%
“…Several exact [2-8] and approximate [9-13] inference algorithms can be found in the literature, due to the fact that it is an NP-hard problem [14,15], which justifies the study of new techniques and algorithms with the aim of widening the class of affordable problems. Some of the most relevant inference algorithms incorporate the ability of dealing with factorised representations of the potentials that represent the probabilistic information.…”
Section: Bayesian Network
confidence: 99%
“…A review of recent literature shows the variety of applications in which they have been successfully used [1,10,24,35,45]. One of the main reasons for using them as the inference engine in a decision support system is that efficient reasoning algorithms can be designed, taking advantage of their structure [2,3,16,44,43,30,29].…”
Section: Introduction
confidence: 99%