Uncertainty Proceedings 1994
DOI: 10.1016/b978-1-55860-332-5.50034-1

Backward Simulation in Bayesian Networks

Abstract: Backward simulation is an approximate inference technique for Bayesian belief networks. It differs from existing simulation methods in that it starts simulation from the known evidence and works backward (i.e., contrary to the direction of the arcs). The technique's focus on the evidence leads to improved convergence in situations where the posterior beliefs are dominated by the evidence rather than by the prior probabilities. Since this class of situations is large, the technique may make practical the applic…
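The abstract's idea can be made concrete on a minimal example. In this hypothetical two-node network A → B with evidence B = 1, backward simulation draws the parent A in proportion to the likelihood P(B=1 | A) — starting from the evidence and working against the arc — and weights each sample by the prior P(A); the weighted frequencies then converge to the posterior P(A | B=1). The network, names, and numbers below are illustrative, not taken from the paper:

```python
import random

# Hypothetical two-node network A -> B (all probabilities illustrative).
p_a = {0: 0.9, 1: 0.1}           # prior P(A)
p_b1_given_a = {0: 0.2, 1: 0.95} # likelihood P(B=1 | A)

def backward_simulate(n_samples, seed=0):
    """Estimate P(A=1 | B=1) by backward simulation: sample A in
    proportion to P(B=1 | A), then weight each sample by the prior P(A)."""
    rng = random.Random(seed)
    z = p_b1_given_a[0] + p_b1_given_a[1]        # normalizer for the backward step
    q1 = p_b1_given_a[1] / z                     # sampling distribution q(A) ∝ P(B=1|A)
    weights = {0: 0.0, 1: 0.0}
    for _ in range(n_samples):
        a = 1 if rng.random() < q1 else 0        # backward step: start from the evidence
        weights[a] += p_a[a]                     # importance weight = prior of the sampled root
    return weights[1] / (weights[0] + weights[1])

# Exact posterior for comparison: P(A=1|B=1) ∝ P(A=1) P(B=1|A=1).
exact = (p_a[1] * p_b1_given_a[1]) / (
    p_a[0] * p_b1_given_a[0] + p_a[1] * p_b1_given_a[1])
```

Because the sampling distribution is proportional to the likelihood, the importance weight reduces to the prior of the backward-sampled root, and the weighted estimate targets the posterior exactly.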

Cited by 36 publications (28 citation statements) | References 12 publications
“…A variety of Monte Carlo algorithms have been developed (see Neal, 1993) and applied to the inference problem in graphical models (Dagum & Luby, 1993; Fung & Favero, 1994; Gilks, Thomas, & Spiegelhalter, 1994; Jensen, Kong, & Kjaerulff, 1995; Pearl, 1988). Advantages of these algorithms include their simplicity of implementation and theoretical guarantees of convergence.…”
Section: P(h | e) = P(h, e) / P(e)
confidence: 99%
“…A variety of Monte Carlo algorithms have been developed (see MacKay, this volume, and Neal, 1993) and applied to the inference problem in graphical models (Dagum & Luby, 1993; Fung & Favero, 1994; Gilks, Thomas, & Spiegelhalter, 1994; Jensen, Kong, & Kjærulff, 1995; Pearl, 1988). Advantages of these algorithms include their simplicity of implementation and theoretical guarantees of convergence.…”
Section: Introduction
confidence: 99%
“…Randomly sample from Cat according to its prior probability (say, we choose False). There are a bewilderingly large variety of variants of this scheme, both in the uncertainty in AI literature (Fung 1994; Shachter 1989; Henrion 1988; Pearl 1987) and in the traditional statistical literature. All suffer from three problems: (1) the basic approximation problem (for example, the problem of determining if a probability is less than a specified value) is NP-hard (Dagum 1993), (2) error decreases as the square of the number of samples, and (3) unexpected evidence on nonroot nodes can reduce the number of useful samples collected.…”
Section: Approximate Inference Simulation
confidence: 99%
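Problem (3) in the statement above — unexpected evidence on non-root nodes reducing the number of useful samples — is easy to see with plain forward (logic) sampling: the sampler simulates along the arcs and must discard every sample that disagrees with the evidence, so unlikely evidence wastes most of the work. A minimal sketch on a hypothetical two-node network A → B with evidence B = 1 (all numbers illustrative):

```python
import random

# Hypothetical A -> B network; evidence B = 1 sits on the non-root node.
p_a = {0: 0.9, 1: 0.1}           # prior P(A)
p_b1_given_a = {0: 0.2, 1: 0.95} # likelihood P(B=1 | A)

def logic_sample(n_samples, seed=0):
    """Forward (logic) sampling with rejection: simulate along the arcs,
    keep only samples that agree with the evidence B = 1.
    Returns (posterior estimate of P(A=1 | B=1), acceptance rate)."""
    rng = random.Random(seed)
    kept = a_ones = 0
    for _ in range(n_samples):
        a = 1 if rng.random() < p_a[1] else 0            # sample the root from its prior
        b = 1 if rng.random() < p_b1_given_a[a] else 0   # then the child given A
        if b != 1:
            continue                                     # evidence mismatch: sample wasted
        kept += 1
        a_ones += a
    return a_ones / kept, kept / n_samples
```

Here P(B=1) = 0.9·0.2 + 0.1·0.95 = 0.275, so roughly 72% of the forward samples are thrown away; starting the simulation from the evidence, as backward simulation does, avoids exactly this waste.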
“…All suffer from three problems: (1) the basic approximation problem (for example, the problem of determining if a probability is less than a specified value) is NP-hard (Dagum 1993), (2) error decreases as the square of the number of samples, and (3) unexpected evidence on nonroot nodes can reduce the number of useful samples collected. The last is particularly problematic and has largely restricted the use of such methods to the computation of priors; see Fung (1994).…”
Section: Approximate Inference Simulation
confidence: 99%