2011
DOI: 10.1007/978-3-642-23786-7_52

Constraint Propagation for Efficient Inference in Markov Logic

Cited by 6 publications (4 citation statements)
References 4 publications

“…Finally, the soft constraints S are grounded much more efficiently by taking frozen atoms into account. Our approach may also be seen as an extension of a proposal by Papai et al (2011).…”
Section: Boosting Inference Efficiency (mentioning)
confidence: 97%
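The excerpt above refers to grounding the soft constraints more efficiently by taking frozen atoms into account. Below is a minimal Python sketch of that general idea, assuming a toy clause encoding; the function ground_soft_clauses and all data structures are illustrative, not the cited system's API. An atom whose truth value is already fixed either satisfies a ground clause, which can then be skipped, or is dropped from it as a falsified literal.

from itertools import product

# Hedged sketch: grounding soft clauses while exploiting "frozen" atoms,
# i.e. ground atoms whose truth value is already fixed by hard constraints.
def ground_soft_clauses(soft_clauses, constants, frozen):
    """soft_clauses: list of (weight, [(pred, vars, positive), ...]);
    constants: list of domain constants;
    frozen: dict mapping ground atoms (pred, args) to True/False."""
    groundings = []
    for weight, literals in soft_clauses:
        variables = sorted({v for _, vs, _ in literals for v in vs})
        for binding in product(constants, repeat=len(variables)):
            env = dict(zip(variables, binding))
            reduced, satisfied = [], False
            for pred, vs, positive in literals:
                atom = (pred, tuple(env[v] for v in vs))
                if atom in frozen:
                    if frozen[atom] == positive:   # literal already true
                        satisfied = True           # whole ground clause satisfied
                        break
                    # literal already false: drop it from the clause
                else:
                    reduced.append((atom, positive))
            if satisfied or not reduced:
                continue  # constant contribution, no ground clause needed
            groundings.append((weight, reduced))
    return groundings

# Example: with Smokes(a) frozen to True, the soft clause
#   w: !Smokes(x) v Cancer(x)
# grounds to a unit clause over Cancer(a) only.
soft = [(1.5, [("Smokes", ("x",), False), ("Cancer", ("x",), True)])]
print(ground_soft_clauses(soft, ["a"], {("Smokes", ("a",)): True}))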
“…Lifted LBP: Another promising research area that has been recently explored seeks to improve the scalability of LBP on models that feature large networks. Here, mainstream work attempts to exploit some structural properties in the network like symmetry (Ahmadi et al 2013), determinism (Papai et al 2011; Ibrahim et al 2015), sparseness (Poon et al 2008), and type hierarchy (Kiddon and Domingos 2011) to scale LBP inference. For instance, Lifted Inference either directly operates on the first-order structure or uses the symmetry present in the structure of the network to reduce its size (e.g., Ahmadi et al 2013).…”
Section: Related Work (mentioning)
confidence: 99%
“…Current Markov logic solvers take advantage of the underlying logical structure to perform more powerful optimizations, such as Alchemy's lifted inference in belief propagation and MC-SAT (Poon & Domingos, 2006). Additionally, domain pruning, where one uses hard constraints to infer reduced domains for predicates, has been shown to lead to significant speed-ups (Papai, Singla, & Kautz, 2011).…”
Section: Related Work (mentioning)
confidence: 99%
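The domain-pruning idea described above can be illustrated with a simplified, propositional stand-in: unit propagation over ground hard clauses fixes atoms to True or False, and only the atoms that remain unfixed need to enter the soft-network grounding and inference. The sketch below makes those assumptions; the identifiers and the clause encoding are my own and are not Alchemy's or the cited paper's API.

# Hedged sketch of pruning via propagation of hard constraints:
# a small unit-propagation loop over ground hard clauses.
def propagate_hard(hard_clauses, fixed):
    """hard_clauses: list of clauses, each a list of (atom, positive) literals;
    fixed: dict atom -> bool seeded with evidence, extended in place."""
    changed = True
    while changed:
        changed = False
        for clause in hard_clauses:
            unassigned, satisfied = [], False
            for atom, positive in clause:
                if atom in fixed:
                    if fixed[atom] == positive:
                        satisfied = True
                        break
                else:
                    unassigned.append((atom, positive))
            if satisfied:
                continue
            if len(unassigned) == 1:        # unit clause: the literal must hold
                atom, positive = unassigned[0]
                fixed[atom] = positive
                changed = True
            elif not unassigned:
                raise ValueError("hard constraints are unsatisfiable")
    return fixed

# Example: evidence Smokes(a) plus the hard rule !Smokes(x) v Cancer(x)
# forces Cancer(a), so it never needs to be a grounding/query variable.
evidence = {("Smokes", ("a",)): True}
hard = [[(("Smokes", ("a",)), False), (("Cancer", ("a",)), True)]]
print(propagate_hard(hard, evidence))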