2010
DOI: 10.1145/1831407.1831431
Nonparametric belief propagation

Abstract: In applications of graphical models arising in fields such as computer vision, the hidden variables of interest are most naturally specified by continuous, non-Gaussian distributions. However, due to the limitations of existing inference algorithms, it is often necessary to form coarse, discrete approximations to such models. In this paper, we develop a nonparametric belief propagation (NBP) algorithm, which uses stochastic methods to propagate kernel-based approximations to the true continuous messages. Each …
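The abstract describes propagating kernel-based message approximations with stochastic (sampling) methods. A minimal 1-D sketch of one such message update is below; the function names, the Gaussian kernel bandwidth, and the importance-resampling scheme for approximating the message product are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nbp_message_update(incoming_particles, pairwise_sample, n_particles=100,
                       bandwidth=0.1, rng=None):
    """One kernel-based message update in the spirit of NBP (sketch only).

    incoming_particles : list of 1-D arrays, particle sets of incoming messages
    pairwise_sample    : callable (x, rng) -> sample of the neighbor variable,
                         drawn from the pairwise potential conditioned on x
    """
    rng = np.random.default_rng() if rng is None else rng
    # Approximate the product of incoming kernel messages by resampling:
    # pool all particles, weight each by the kernel density of the OTHER
    # messages evaluated at that particle.
    pool = np.concatenate(incoming_particles)
    weights = np.ones(len(pool))
    for msg in incoming_particles:
        d = pool[:, None] - msg[None, :]
        # Gaussian-kernel density of each pooled particle under this message
        weights *= np.exp(-0.5 * (d / bandwidth) ** 2).mean(axis=1)
    weights /= weights.sum()
    src = rng.choice(pool, size=n_particles, p=weights)
    # Propagate each sampled source state through the pairwise potential
    # to obtain the particle set of the outgoing message.
    return np.array([pairwise_sample(x, rng) for x in src])
```

With two incoming messages concentrated near zero and a Gaussian pairwise potential, the outgoing particle set also concentrates near zero, mirroring how NBP keeps messages as particle/kernel representations rather than discretizing the state space.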

Cited by 232 publications (329 citation statements)
References 35 publications
“…However, a different fate at an internal node would yield a different expectation for the gene expression of the duplicates. We therefore modify the Nonparametric Belief Propagation (NBP) [18] to infer ancestral gene expression levels. NBP allows us to incorporate evolutionary fates (SF, CF, NF) at each internal node to infer ancestral gene expression level.…”
Section: Methods
confidence: 99%
“…Finally, we repeat forward and backward particle passing procedures using M particles drawn from the re-estimated GMMs. The parameters of GMM will converge after several NBP iterations T [18], where T is often around 500 ~ 1000 in practice.…”
Section: Methods
confidence: 99%
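The quoted procedure — repeatedly drawing M particles from re-estimated Gaussian mixture models (GMMs) and passing them until the parameters converge after T iterations — can be sketched as follows. This is a runnable 1-D illustration under simplifying assumptions: a small hand-rolled EM fit, a fixed component count, a toy potential, and far fewer iterations than the 500–1000 the citing authors report.

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=20, rng=None):
    """Fit a k-component 1-D GMM to particles x with a few EM steps."""
    rng = np.random.default_rng(0) if rng is None else rng
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each component for each particle
        d = x[None, :] - mu[:, None]
        logp = (-0.5 * d**2 / var[:, None]
                - 0.5 * np.log(2 * np.pi * var)[:, None]
                + np.log(pi)[:, None])
        logp -= logp.max(axis=0, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=0, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=1) + 1e-12
        mu = (r @ x) / nk
        d = x[None, :] - mu[:, None]
        var = (r * d**2).sum(axis=1) / nk + 1e-6
        pi = nk / nk.sum()
    return pi, mu, var

def sample_gmm(pi, mu, var, m, rng):
    """Draw m particles from the GMM (pi, mu, var)."""
    idx = rng.choice(len(pi), size=m, p=pi)
    return rng.normal(mu[idx], np.sqrt(var[idx]))

def nbp_iterate(pi, mu, var, potential, m=200, T=50, rng=None):
    """Repeat particle passing: sample M particles from the current GMM,
    pass them through the potential, and refit the GMM, for T iterations."""
    rng = np.random.default_rng(0) if rng is None else rng
    for _ in range(T):
        x = sample_gmm(pi, mu, var, m, rng)  # draw M particles
        x = potential(x, rng)                # forward/backward passing step
        pi, mu, var = fit_gmm_1d(x, k=len(pi), rng=rng)
    return pi, mu, var
```

With a contractive potential such as `x -> 0.5*x + noise`, the mixture parameters settle near a fixed point after a few dozen iterations, illustrating the convergence behavior the citing authors describe.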
“…This can be done using any applicable inference mechanism. We use nonparametric belief propagation [21] optimized to exploit the specific structure of this inference problem [6]. The particles used to represent the densities are directly derived from individual feature observations.…”
Section: Markov Network For Object Representation
confidence: 99%
“…We solve them in a unified manner by belief propagation (BP) [21] [22]. Augmenting nodes are introduced as nodes that do not correspond to any specific object but are responsible for generating new object nodes by receiving belief messages from nodes of key objects.…”
Section: Integration Of Observation and Spatio-temporal Context
confidence: 99%