2019
DOI: 10.1609/aaai.v33i01.33017825

Exact and Approximate Weighted Model Integration with Probability Density Functions Using Knowledge Compilation

Abstract: Weighted model counting has recently been extended to weighted model integration, which can be used to solve hybrid probabilistic reasoning problems. Such problems involve both discrete and continuous probability distributions. We show how standard knowledge compilation techniques (to SDDs and d-DNNFs) apply to weighted model integration, and use them in two novel solvers, one exact and one approximate. Furthermore, we extend the class of employable weight functions to actual probability density functions…
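As a point of reference for the abstract, the sketch below spells out the WMI quantity on a toy hybrid problem: enumerate the truth assignments of the Boolean atoms, then integrate a probability density over the part of the continuous domain that satisfies the formula. The formula, the literal weights (0.6 / 0.4), the Gaussian density, and the use of numerical quadrature are all illustrative assumptions, not taken from the paper; the actual solvers work on compiled circuits rather than by brute-force enumeration.

```python
# Toy weighted model integration (WMI) by brute force: sum over Boolean
# assignments, integrate a density over the satisfying region of x.
# All numbers below are made up for illustration.
from scipy.integrate import quad
from scipy.stats import norm

density = norm(loc=0.5, scale=0.2).pdf  # weight (pdf) attached to x

# Formula: (b or x > 0.5) and (0 <= x <= 1).
# For each Boolean assignment, the satisfying interval of x was worked
# out by hand; real solvers derive it from the compiled representation.
assignments = [
    (True,  0.6, (0.0, 1.0)),   # b = true,  weight 0.6, x in [0, 1]
    (False, 0.4, (0.5, 1.0)),   # b = false, weight 0.4, x in (0.5, 1]
]

wmi = 0.0
for _, w_bool, (lo, hi) in assignments:
    volume, _ = quad(density, lo, hi)  # numerical integration of the pdf
    wmi += w_bool * volume

print(f"WMI of the toy formula: {wmi:.4f}")
```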

Cited by 17 publications (13 citation statements)
References 19 publications
“…However, little attention has been given to the design of algorithms for structure learning of hybrid SRL models. The same is true for works on hybrid probabilistic programming (HProbLog, Gutmann et al 2010), (DC, Gutmann et al 2011;Nitti et al 2016a), (Extended-Prism, Islam et al 2012), (Hybrid-cplint, Alberti et al 2017), (Michels et al 2016), (BLOG, Wu et al 2018), (Dos Martires et al 2019). Closest to our work is the work on hybrid relational dependency networks (HRDNs, Ravkic et al 2015), for which structure learning was also studied, but this learning algorithm assumes that the data is fully observed.…”
Section: Introduction (mentioning)
confidence: 89%
“…Unfortunately, however, by giving up the propositional language, and in particular, the use of arithmetic circuits, we lose the tractability results and unified evaluation scheme offered by AMC. As mentioned, in limited settings, continuous properties can nonetheless be accorded to the atomic propositions, as considered in [40,84].…”
Section: Statistical Modeling (mentioning)
confidence: 99%
“…The main limitation of the AMC proposal is that the underlying language is propositional logic, and so the semantics is that of classical logic and computations are essentially defined over discrete spaces. It is, however, possible to accord continuous properties to the underlying propositions [84], which may be sufficient in some cases, but it does not allow for arbitrary arithmetical reasoning. So, the AMC proposal offers an attractive generic inference scheme in the propositional setting.…”
Section: Introduction (mentioning)
confidence: 99%
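For context, the AMC quantity this statement refers to can be written as the usual semiring sum of products over models; the notation below follows the standard algebraic model counting formulation and is not taken from this page.

```latex
% Algebraic model counting over a commutative semiring (A, \oplus, \otimes):
% sum over the models of a propositional theory \varphi, product over the
% labels \alpha(\ell) of the literals in each model. With \oplus = +,
% \otimes = \times and probability labels this reduces to WMC; WMI replaces
% the finite sum over the continuous-valued assignments by an integral.
\[
  \mathrm{AMC}(\varphi, \alpha) \;=\; \bigoplus_{I \,\models\, \varphi} \; \bigotimes_{\ell \in I} \alpha(\ell)
\]
```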
“…WMC is one of the state-of-the-art approaches for inference in many discrete probabilistic models. Existing general techniques for exact WMI include DPLL-based search with numerical [3,25,26] or symbolic integration [12] and compilation-based algorithms [19,32].…”
Section: Related Work (mentioning)
confidence: 99%