2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops)
DOI: 10.1109/iccvw.2011.6130310

Sum-product networks: A new deep architecture

Abstract: The key limiting factor in graphical model inference and learning is the complexity of the partition function. We thus ask the question: what are general conditions under which the partition function is tractable? The answer leads to a new kind of deep architecture, which we call sum-product networks (SPNs). SPNs are directed acyclic graphs with variables as leaves, sums and products as internal nodes, and weighted edges. We show that if an SPN is complete and consistent it represents the partition function and…
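
The abstract's construction is easy to make concrete. Below is a minimal sketch in Python, assuming nothing beyond the abstract's description: leaves are variable indicators, internal nodes are weighted sums and products, and evaluating the root with all indicators set to 1 yields the partition function. The class names and the example network are illustrative, not from the paper.

```python
# Minimal SPN sketch (illustrative; names are not from the paper).
# Leaves are indicator variables; internal nodes are weighted sums or products.

class Leaf:
    """Indicator for x_i (positive=True) or its negation (positive=False)."""
    def __init__(self, var, positive=True):
        self.var, self.positive = var, positive

    def value(self, assignment):
        # assignment maps var -> bool; a variable absent from the
        # assignment is marginalized out, so both indicators are 1.
        if self.var not in assignment:
            return 1.0
        return 1.0 if assignment[self.var] == self.positive else 0.0

class Sum:
    """Weighted sum over children; 'complete' means all children
    have the same scope (set of variables)."""
    def __init__(self, children, weights):
        self.children, self.weights = children, weights

    def value(self, assignment):
        return sum(w * c.value(assignment)
                   for w, c in zip(self.weights, self.children))

class Product:
    """Product over children; consistency holds trivially here because
    the children have disjoint scopes."""
    def __init__(self, children):
        self.children = children

    def value(self, assignment):
        p = 1.0
        for c in self.children:
            p *= c.value(assignment)
        return p

# Toy SPN over two Booleans x1, x2 (a fully factorized distribution).
x1, nx1 = Leaf("x1", True), Leaf("x1", False)
x2, nx2 = Leaf("x2", True), Leaf("x2", False)
spn = Product([Sum([x1, nx1], [0.6, 0.4]),
               Sum([x2, nx2], [0.3, 0.7])])

# Evaluating with all indicators at 1 (empty assignment) gives the
# partition function; with normalized weights it is 1.
print(spn.value({}))              # 1.0
print(spn.value({"x1": True}))    # marginal P(x1 = 1) = 0.6
```

Completeness and consistency are exactly what make this single bottom-up pass return the partition function and marginals; without them the root value has no such interpretation.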

Cited by 383 publications (713 citation statements)
References 23 publications
Citing publications span 2014–2023

“…In the case where an exclusive disjunction is necessary, this requires replacing the classification mechanism (based on W in the proposed network) with a more complex one. Sum-product networks [69] could be an interesting source of inspiration for this replacement.…”
Section: Perspectives and Future Work (mentioning)
confidence: 99%
“…Using the differential approach introduced in [4], inference is also conceptually easy in these models. In this paper, we consider sum-product networks (SPNs), introduced in [6]. SPNs can be interpreted as Bayesian networks with a deep hierarchical structure of latent variables and a high degree of context-specific independence.…”
Section: Introduction (mentioning)
confidence: 99%
“…In [4,5,6] and related work, novel types of probabilistic models emerged which allow controlling the inference cost during learning while still modeling complex variable dependencies. Using the differential approach introduced in [4], inference is also conceptually easy in these models.…”
Section: Introduction (mentioning)
confidence: 99%
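
The "differential approach" these excerpts cite is, as best I can tell, Darwiche's observation that the network polynomial is multilinear in the evidence indicators, so marginals fall out as partial derivatives of the root value. Here is a hedged sketch on the same toy distribution as above; numerical differentiation is used purely for illustration (a real implementation would use a backward pass), and all function names are invented.

```python
# Sketch of the differential approach: marginals as partial derivatives
# of the network polynomial w.r.t. evidence indicators. Illustrative only.

def f(l1, nl1, l2, nl2):
    # Network polynomial of the toy SPN over Booleans x1, x2:
    # (0.6*l1 + 0.4*nl1) * (0.3*l2 + 0.7*nl2)
    return (0.6 * l1 + 0.4 * nl1) * (0.3 * l2 + 0.7 * nl2)

def partial(fn, args, i, eps=1e-6):
    """Central-difference partial derivative of fn w.r.t. argument i."""
    hi = list(args); hi[i] += eps
    lo = list(args); lo[i] -= eps
    return (fn(*hi) - fn(*lo)) / (2 * eps)

# With all indicators at 1 (no evidence), df/dl1 = P(x1 = 1):
ones = [1.0, 1.0, 1.0, 1.0]
print(partial(f, ones, 0))  # ~0.6 = P(x1 = 1)

# With evidence x2 = 1 (so nl2 = 0), df/dl1 = P(x1 = 1, x2 = 1):
ev = [1.0, 1.0, 1.0, 0.0]
print(partial(f, ev, 0))    # ~0.18 = 0.6 * 0.3
```

Because the polynomial is multilinear, these derivatives are exact marginals, which is why inference in such models is "conceptually easy": one upward pass plus one derivative pass answers all single-variable queries at once.
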
“…When it comes to the use of deep learning for generation tasks, we can find various models, such as the deep Boltzmann machine (DBM) [8], [9], the denoising auto-encoder (DAE) [10], the shape Boltzmann machine (ShapeBM) [11], and the sum-product network (SPN) [12]. These models were mainly introduced to capture high-order abstractions for good representation of the observations, rather than for discriminative goals.…”
Section: Introduction (mentioning)
confidence: 99%
“…In the image generation experiments, the images generated from the DRM were much more realistic than those from the other generative models.…”
(mentioning)
confidence: 99%