2008
DOI: 10.1016/j.artint.2007.11.002
On probabilistic inference by weighted model counting

Abstract: A recent and effective approach to probabilistic inference calls for reducing the problem to one of weighted model counting (WMC) on a propositional knowledge base. Specifically, the approach calls for encoding the probabilistic model, typically a Bayesian network, as a propositional knowledge base in conjunctive normal form (CNF) with weights associated to each model according to the network parameters. Given this CNF, computing the probability of some evidence becomes a matter of summing the weights of all C…
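As a concrete, if naive, illustration of the reduction the abstract describes, the sketch below performs brute-force weighted model counting over a small CNF: each literal carries a weight, a model's weight is the product of the weights of its true literals, and the count is the sum over satisfying models. This is only a minimal sketch; the function name, the example CNF, and the weights are illustrative assumptions, not the paper's encoding, and real WMC engines exploit the CNF's structure via search or compilation rather than enumerating all assignments.

```python
from itertools import product

def wmc(n_vars, clauses, weight):
    """Brute-force literal-weighted model counting.

    clauses: list of clauses, each a list of signed 1-based ints
             (i means variable i is true, -i means it is false).
    weight:  dict mapping each signed literal to its weight.
    Returns the sum, over all models of the CNF, of the product
    of the weights of the literals the model makes true.
    """
    def lit_true(lit, assignment):
        value = assignment[abs(lit) - 1]
        return value if lit > 0 else not value

    total = 0.0
    for assignment in product([False, True], repeat=n_vars):
        # A model satisfies the CNF iff every clause has a true literal.
        if all(any(lit_true(l, assignment) for l in c) for c in clauses):
            w = 1.0
            for i, value in enumerate(assignment, start=1):
                w *= weight[i] if value else weight[-i]
            total += w
    return total

# Two independent events with P(x1)=0.3 and P(x2)=0.6;
# the CNF (x1 OR x2) then counts P(x1 or x2) = 1 - 0.7*0.4 = 0.72.
weights = {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4}
print(wmc(2, [[1, 2]], weights))  # 0.72
```

With an empty CNF the count is 1.0, reflecting that the literal weights above form a probability distribution; encoding a full Bayesian network additionally requires indicator and parameter variables as described in the paper.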

Cited by 239 publications (274 citation statements)
References 26 publications
“…The concept of probabilistic inference algorithms can be easily understood using the proposed method known as singly connected Bayesian networks (SCBNs), a special case of BNs. The concept of probabilistic inference is used in relation to Bayesian networks by [19].…”
Section: Methods
Confidence: 99%
“…Researchers [19] define the semantics of Bayesian networks in the form of a joint distribution. A novel method is constructed from the probability distributions of DNA sequences.…”
Section: Methods
Confidence: 99%
“…These are regularities that arise from using Boolean logic to compactly define potentials (Chavira and Darwiche 2008). They are not exploited by factor-based lifted inference algorithms, such as FOVE (de Salvo Braz et al 2005).…”
Section: Proposition 3, Lifted Weight Learning by First-Order Knowledg…
Confidence: 99%
“…If the cutset is bounded, then cutset conditioning is tractable. However, note that unlike classic cutset conditioning, cutset networks can take advantage of determinism [3,12] and context-specific independence [2] by allowing different variables to be conditioned on at the same level in the OR search tree. As a result, they can yield a compact representation even if the size of the cutset is arbitrarily large.…”
Section: Introduction
Confidence: 99%