Proceedings of the 24th International Conference on Machine Learning 2007
DOI: 10.1145/1273496.1273538

Efficient inference with cardinality-based clique potentials

Abstract: Many collective labeling tasks require inference on graphical models where the clique potentials depend only on the number of nodes that get a particular label. We design efficient inference algorithms for various families of such potentials. Our algorithms are exact for arbitrary cardinality-based clique potentials on binary labels and for max-like and majority-like clique potentials on multiple labels. Moving towards more complex potentials, we show that inference becomes NP-hard even on cliques with homogene…
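
For concreteness, a cardinality-based clique potential on binary labels can be written as below (illustrative notation, not the paper's):

```latex
% Binary labels y_i in {0,1} on a clique C; the potential depends on the
% configuration y_C only through the number of nodes labelled 1.
\[
  \psi_C(y_C) = f\Big(\sum_{i \in C} y_i\Big), \qquad y_i \in \{0,1\}.
\]
% MAP inference over the clique then reduces to choosing the count
% k = \sum_i y_i (and, given k, the best k nodes), instead of scoring
% all 2^{|C|} label configurations.
```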

Cited by 29 publications (33 citation statements); references 16 publications. Citation statements, ordered by relevance:
“…Table 2: A list of systems and the aggregation operators they use to aggregate neighborhood class labels: PRMs, Friedman et al. [12] (mode); [35] (mode, count, exists); Macskassy & Provost [36] (prop); Gupta, Diwan, & Sarawagi [21] (mode, count); McDowell, Gupta, & Aha [39] (prop). The systems include probabilistic relational models (PRMs), relational Markov networks (RMNs) and Markov logic networks (MLNs).…”
Section: Approximate Inference Algorithms For Approaches Based On Glo… (mentioning, confidence: 99%)
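
The operator names in the quoted table (mode, count, exists, prop) suggest simple neighbourhood-label aggregations; the sketch below shows one plausible reading of what such operators compute (illustrative only, not code from any of the cited systems):

```python
from collections import Counter

def aggregate_neighbor_labels(labels, target=None):
    """Toy neighbourhood-label aggregators: mode, count, exists, prop.

    labels : class labels of a node's neighbours, e.g. ['A', 'B', 'A'].
    target : the label examined by the count / exists / prop operators.
    """
    counts = Counter(labels)
    return {
        "mode":   counts.most_common(1)[0][0] if labels else None,        # most frequent label
        "count":  counts.get(target, 0),                                   # occurrences of target
        "exists": target in counts,                                        # any neighbour with target?
        "prop":   counts.get(target, 0) / len(labels) if labels else 0.0,  # fraction with target
    }

# aggregate_neighbor_labels(['A', 'B', 'A'], target='A')
# -> {'mode': 'A', 'count': 2, 'exists': True, 'prop': 0.666...}
```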
“…The global minimum of any MPF in this (and therefore also the linear) class will generally, ignoring integrability, produce delta function marginal statistics. This can be seen from the fact that a concave function $\sum_k f_k(h_k)$ defined over a simplex $\{\{h_k\} : h_k \ge 0,\ \sum_k h_k = \mathrm{const}\}$ attains a minimum at an extreme point of this simplex.…”
Section: Characterizing Cost Functions (mentioning, confidence: 99%)
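
A quick numerical illustration of the quoted argument, using the square root as a stand-in concave $f_k$ (our choice of example, not taken from the citing paper):

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5
f = np.sqrt  # each f_k concave on [0, 1]; the constraint sum_k h_k = 1 defines a simplex

def objective(h):
    return np.sum(f(h))

# Vertices of the simplex: one coordinate equal to 1, the rest 0.
vertex_vals = [objective(np.eye(K)[k]) for k in range(K)]

# Random interior points of the simplex (Dirichlet samples).
interior_vals = [objective(rng.dirichlet(np.ones(K))) for _ in range(10_000)]

print(min(vertex_vals))    # 1.0 at every vertex
print(min(interior_vals))  # strictly larger than 1.0: the minimum sits at an extreme point
```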
“…It is well-known that the minimum of (15) can be computed in $O(n \log n)$ time [8]. We need to sort values $\theta_i$ in non-decreasing order, evaluate the cost of $n + 1$ labellings (0, …”
Section: Case I: Binary Variables (mentioning, confidence: 99%)
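
The quoted recipe translates directly into a short routine; the sketch below assumes a cost of the form $E(x) = \sum_i \theta_i x_i + g(\sum_i x_i)$ over binary $x_i$, which may differ in detail from equation (15) of the citing paper:

```python
import math

def minimize_binary_cardinality_cost(theta, g):
    """Minimize E(x) = sum_i theta[i]*x[i] + g(sum_i x[i]) over x in {0,1}^n.

    Assumed cost form (equation (15) in the citing paper may differ).
    With theta sorted non-decreasingly, the best labelling with exactly k
    ones puts them on the k smallest theta values, so only n + 1 candidate
    labellings need to be evaluated: O(n log n) overall.
    """
    n = len(theta)
    order = sorted(range(n), key=lambda i: theta[i])   # O(n log n) sort
    best_cost, best_k, prefix = math.inf, 0, 0.0
    for k in range(n + 1):
        cost = prefix + g(k)        # prefix = sum of the k smallest thetas
        if cost < best_cost:
            best_cost, best_k = cost, k
        if k < n:
            prefix += theta[order[k]]
    x = [0] * n
    for i in order[:best_k]:
        x[i] = 1
    return x, best_cost

# e.g. minimize_binary_cardinality_cost([0.3, -1.0, 0.2], g=lambda k: 0.5 * k * k)
# -> ([0, 1, 0], -0.5)
```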
“…all algorithms in [49] can be used as soft GAC-propagators. We anticipate that in the future, more global constraints that have tractable soft propagators and are useful in applications will be discovered.…”
Section: Handling High-arity and Global Constraints (mentioning, confidence: 99%)