1990
DOI: 10.1016/b978-0-444-88738-2.50024-5
Simulation Approaches to General Probabilistic Inference on Belief Networks

Abstract: Although a number of algorithms have been developed to solve probabilistic inference problems on belief networks, they can be divided into two main groups: exact techniques, which exploit the conditional independence revealed when the graph structure is relatively sparse, and probabilistic sampling techniques, which exploit the "conductance" of an embedded Markov chain when the conditional probabilities have non-extreme values. In this paper, we investigate a family of Monte Carlo sampling techniques similar to …

Cited by 182 publications (152 citation statements). References 13 publications.
“…Several approximate algorithms based on stochastic sampling have been developed. Of these, best known are probabilistic logic sampling (Henrion, 1988), likelihood sampling (Shachter & Peot, 1989; Fung & Chang, 1989), and backward sampling (Fung & del Favero, 1994), Adaptive Importance Sampling (AISBN) (Cheng & Druzdzel, 2000), and Approximate Posterior Importance Sampling (APIS-BN) (Yuan & Druzdzel, 2003). Approximate belief updating in Bayesian networks has also been shown to be worst case NP-hard (Dagum & Luby, 1993).…”
Section: Bayesian Updating
confidence: 99%
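The excerpt above names probabilistic logic sampling (Henrion, 1988) as one of the best-known stochastic approaches. A minimal sketch of the idea on a hypothetical two-node network Rain → WetGrass (the CPT values below are illustrative, not taken from any of the cited papers):

```python
import random

# Illustrative CPTs for a tiny network Rain -> WetGrass.
P_RAIN = 0.2
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}  # P(WetGrass=true | Rain)

def logic_sample(rng):
    """Draw one joint sample by sampling each node given its parents,
    in topological order -- the core of probabilistic logic sampling."""
    rain = rng.random() < P_RAIN
    wet = rng.random() < P_WET_GIVEN_RAIN[rain]
    return rain, wet

def estimate_p_rain_given_wet(n, seed=0):
    """Estimate P(Rain | WetGrass=true) by discarding samples that
    disagree with the evidence.  This rejection step is the method's
    weakness: with unlikely evidence, most samples are wasted."""
    rng = random.Random(seed)
    kept = rain_count = 0
    for _ in range(n):
        rain, wet = logic_sample(rng)
        if wet:  # keep only samples consistent with the evidence
            kept += 1
            rain_count += rain
    return rain_count / kept if kept else float("nan")
```

With these illustrative numbers the exact posterior is 0.18 / 0.26 ≈ 0.69, and the estimate converges to it as the sample count grows.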
“…2. We develop importance sampling [15,44] with look-ahead, a new, general algorithm for approximate inference. 3.…”
Section: Contributions
confidence: 99%
“…If we assert such an observation and wish to estimate its probability, the rejection sampling of fun () -> dcoin_and 10 || fail () offers little help: even with 10,000 attempts, all samples are unsuccessful. Importance sampling [15,44] is an approximate inference strategy that improves upon rejection sampling by assigning weights to each sample. For example, because assigning true to the variable lost leads to failure, the sampler should not pick a value for lost randomly; instead, it should force lost to be false, but scale the weight of the resulting sample by 0.1 to compensate for this artificially induced success.…”
Section: Importance Sampling With Look-ahead
confidence: 99%
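The weighting scheme described in the excerpt can be sketched numerically. Assuming, as the excerpt states, that `lost = true` always leads to failure and `P(lost = false) = 0.1`, the sampler forces `lost = false` and carries the 0.1 as a sample weight; the success probability given `lost = false` below is a hypothetical stand-in:

```python
import random

P_NOT_LOST = 0.1            # P(lost = false), per the excerpt
P_WIN_GIVEN_NOT_LOST = 0.5  # hypothetical, for illustration only

def rejection_trial(rng):
    """Rejection sampling: pick `lost` at random; lost = true is a
    wasted sample, which happens 90% of the time here."""
    lost = rng.random() >= P_NOT_LOST
    if lost:
        return None  # rejected
    return 1.0 if rng.random() < P_WIN_GIVEN_NOT_LOST else 0.0

def importance_estimate(n, seed=0):
    """Importance sampling: force lost = false in every trial and
    scale each sample's contribution by P(lost = false) = 0.1, so
    no sample is ever wasted."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        win = rng.random() < P_WIN_GIVEN_NOT_LOST
        total += P_NOT_LOST if win else 0.0
    return total / n  # estimates P(lost = false AND win) = 0.05 here
```

Every importance-sampled trial contributes to the estimate, whereas rejection sampling discards roughly nine in ten trials before any useful work is done.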
“…Such algorithms, particularly stochastic simulation techniques such as likelihood weighting (Shachter & Peot, 1989;Fung & Chang, 1989), provide "anytime" estimates of the required probabilities. This suits our needs very well: Early in the gradient descent process, we need only very rough estimates of the gradient; the use of an anytime algorithm allows these to be generated very quickly.…”
Section: Derivation of the Gradient Formula
confidence: 99%
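The "anytime" property of likelihood weighting that the excerpt relies on can be seen in a minimal sketch: each sample fixes the evidence node and carries a weight equal to the likelihood of that evidence, so a running weighted estimate is available after any number of samples. The two-node network and its probabilities below are illustrative assumptions:

```python
import random

# Illustrative network Rain -> WetGrass, evidence WetGrass = true.
P_RAIN = 0.2
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}

def likelihood_weighting(n, seed=0):
    """Return the running estimates of P(Rain | WetGrass = true).
    Non-evidence nodes are sampled from their priors; evidence nodes
    are clamped, and the sample is weighted by P(evidence | parents)."""
    rng = random.Random(seed)
    num = den = 0.0
    estimates = []
    for _ in range(n):
        rain = rng.random() < P_RAIN     # sample the non-evidence node
        w = P_WET_GIVEN_RAIN[rain]       # weight by evidence likelihood
        num += w * rain
        den += w
        estimates.append(num / den)      # a usable estimate at every step
    return estimates
```

The intermediate entries of `estimates` are rough early answers of the kind the excerpt exploits for gradient descent; the later entries converge to the exact posterior (≈ 0.69 with these numbers).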