2011
DOI: 10.3233/ia-2011-0026

Sampling-based lower bounds for counting queries

Abstract: It is well known that computing relative approximations of weighted counting queries, such as the probability of evidence in a Bayesian network, the partition function of a Markov network, and the number of solutions of a constraint satisfaction problem, is NP-hard. In this paper, we therefore settle on the easier problem of computing high-confidence lower bounds, and propose an algorithm for it based on importance sampling and the Markov inequality. However, a straightforward application of the Markov inequality often yi…
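The truncated abstract describes the core idea: form an unbiased importance-sampling estimate of the weighted count Z and convert it into a high-confidence lower bound via the Markov inequality. A minimal sketch of that idea on a toy weighted-counting problem (the weight function, proposal, and sample sizes here are illustrative, not the paper's benchmarks):

```python
import random

# Toy weighted counting problem: Z = sum over all n-bit vectors x of w(x).
# The weight function below is illustrative; in the paper the target would
# be, e.g., a probability of evidence or a partition function.
n = 10

def weight(x):
    return 2.0 ** sum(x)          # favours vectors with many ones

# Exact value for checking: sum_x 2^{#ones(x)} = prod_i (1 + 2) = 3^n.
exact_Z = 3.0 ** n

def markov_lower_bound(num_samples, delta, rng):
    """(1 - delta)-confidence lower bound on Z via importance sampling."""
    # Proposal q: each bit is 1 with probability 1/2, so q(x) = 2^{-n}.
    total = 0.0
    for _ in range(num_samples):
        x = [rng.randint(0, 1) for _ in range(n)]
        total += weight(x) * (2.0 ** n)   # w(x)/q(x) is unbiased for Z
    z_hat = total / num_samples           # E[z_hat] = Z
    # Markov inequality: P(z_hat >= Z/delta) <= delta, hence
    # delta * z_hat <= Z with probability at least 1 - delta.
    return delta * z_hat

rng = random.Random(0)
lower = markov_lower_bound(10_000, 0.05, rng)
```

The bound δ·Ẑ is valid for any proposal q with full support; its tightness depends on the variance of the importance weights, which is presumably what the cut-off sentence of the abstract goes on to address.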

Cited by 3 publications (6 citation statements)
References 40 publications
“…The quality (tightness) of these bounds depends on the variance of the MS algorithm's output. As noted in [27], the MSB bound tends to decrease (unless the variance is very small) when the number of samples grows. The ASB and MASB schemes are based on the assumption that the sample average and maximum are likely to get larger as more samples are drawn.…”
Section: Order Statistics Bound (OSB)
confidence: 78%
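A hedged illustration of the assumption quoted above (the distribution and sample sizes are illustrative, not from the cited paper): with heavy-tailed importance weights, the typical sample average sits below the true mean and tends to grow as more samples are drawn, since the mean is dominated by rare large draws.

```python
import random
import statistics

def median_sample_average(n_samples, n_trials, rng):
    """Median, over independent trials, of the average of n_samples
    heavy-tailed draws (lognormal with a large sigma)."""
    avgs = []
    for _ in range(n_trials):
        draws = [rng.lognormvariate(0.0, 3.0) for _ in range(n_samples)]
        avgs.append(sum(draws) / n_samples)
    return statistics.median(avgs)

rng = random.Random(7)
small = median_sample_average(10, 200, rng)      # typical average, few samples
large = median_sample_average(10_000, 200, rng)  # typical average, many samples
# With heavy tails, the typical (median) sample average increases with
# the sample size as rare large draws start to be observed.
```

This is the behaviour the average-based schemes rely on; by the same token, the running sample maximum can only grow as more samples arrive.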
“…Namely, to discover a good approximation to each coefficient of G's IP. Taking into consideration the computational complexity of such a procedure for large graphs, a reasonable approach is to use a probabilistic lower bound, which was first introduced by , and is provided next.…”
Section: Multilevel Splitting
confidence: 99%
“…The confidence parameter δ for our probabilistic bounds is set to 0.025. In the experiments, we also used Z_δ, a (1 − δ) probabilistic lower bound by the Markov inequality (Gogate and Dechter 2011), and switched to our lower bound Z^−_∆ when it becomes non-trivial. We also replaced Z^+_∆ with the best deterministic upper bound reached so far if the latter is tighter.…”
Section: Empirical Evaluation
confidence: 99%