2014
DOI: 10.1016/j.neucom.2012.02.055
Two Expectation-Maximization algorithms for Boolean Factor Analysis

Cited by 8 publications (9 citation statements) · References 31 publications
“…In [17], the authors use a Bayesian framework to cast the BMF problem as a maximum-log-likelihood problem and use a message-passing procedure to approximate the MAP assignment. In line with these statistical approaches, the authors of [18]–[20] proposed algorithms based on expectation maximization to solve the Boolean factor analysis problem. The extracted factors are interpreted as the hidden causes of the observed binary data.…”
Section: Related Work
confidence: 99%
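The expectation-maximization approach referenced above treats the binary factors as latent causes. As an illustrative sketch (not the cited authors' actual algorithms), the snippet below assumes a noisy-OR observation model and computes the exact E-step posterior over binary factor vectors by enumeration; all variable names (`W`, `prior`, `eps`) are hypothetical, and the enumeration is feasible only for a small number of factors, which is why the cited papers resort to approximations.

```python
from itertools import product

def noisy_or_lik(x, s, W, eps):
    """P(x | s) under an assumed noisy-OR observation model:
    x_d stays off only if background noise (eps) and every active
    factor k with weight W[d][k] all fail to turn it on."""
    lik = 1.0
    for d, xd in enumerate(x):
        p_off = 1.0 - eps
        for k, sk in enumerate(s):
            if sk:
                p_off *= 1.0 - W[d][k]
        lik *= (1.0 - p_off) if xd else p_off
    return lik

def e_step(x, W, prior, eps=0.05):
    """Exact posterior p(s | x) over all 2^K binary factor vectors
    (an E-step by brute force; tractable only for small K)."""
    K = len(prior)
    post = {}
    for s in product([0, 1], repeat=K):
        ps = 1.0
        for k in range(K):
            ps *= prior[k] if s[k] else 1.0 - prior[k]
        post[s] = ps * noisy_or_lik(x, s, W, eps)
    Z = sum(post.values())
    return {s: p / Z for s, p in post.items()}
```

For example, with two factors where factor 0 explains dimensions 0–1 and factor 1 explains dimensions 2–3, the observation `[1, 1, 0, 0]` yields a posterior concentrated on the factor vector `(1, 0)`.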
“…To solve the above inference problem, they introduce a graphical model for the posterior and use belief propagation to find the matrices B and C that maximize the posterior. Frolov et al. (2014) and Liang and Lu (2019) used a similar generative model with expectation maximization instead of belief propagation. Rukat et al. (2017a) used a Metropolised Gibbs sampler for the posterior inference.…”
Section: Continuous Optimization
confidence: 99%
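The generative view described above factorizes a binary matrix X as a Boolean product of B and C and scores candidate factorizations by likelihood under observation noise. A minimal sketch of that objective, assuming an entry-wise Bernoulli flip-noise model (`p_flip` is an illustrative parameter, not taken from the cited papers):

```python
import math

def boolean_product(B, C):
    """Boolean matrix product: X[i][j] = OR_k (B[i][k] AND C[k][j])."""
    n, K, m = len(B), len(C), len(C[0])
    return [[int(any(B[i][k] and C[k][j] for k in range(K)))
             for j in range(m)] for i in range(n)]

def log_likelihood(X, B, C, p_flip=0.1):
    """log p(X | B, C) when each entry of the Boolean product is
    independently flipped with probability p_flip; maximizing this
    over (B, C) is the maximum-likelihood view of BMF sketched above."""
    Xhat = boolean_product(B, C)
    ll = 0.0
    for i in range(len(X)):
        for j in range(len(X[0])):
            ll += math.log(1 - p_flip) if X[i][j] == Xhat[i][j] else math.log(p_flip)
    return ll
```

For instance, `boolean_product([[1,0],[0,1],[1,1]], [[1,1,0],[0,1,1]])` gives `[[1,1,0],[0,1,1],[1,1,1]]`, and a noise-free X attains the maximal log-likelihood of `n*m*log(1 - p_flip)`.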
“…Similarly, inference and learning for sparse coding models that replace the linear combination with nonlinear ones have been investigated. Hidden-cause models with nonlinearly interacting signal sources include the noisy-OR combination rule [25–30], exclusive causes [31], and maximum superposition [19, 20, 32]. A combination of linear superposition followed by a sigmoidal nonlinearity (post-linear nonlinearities) has also been investigated (nonlinear ICA [33], sigmoid belief networks [34]).…”
Section: Model: Nonlinear Spike-and-slab Sparse Coding
confidence: 99%
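The two nonlinear combination rules named above differ in how active causes interact. A minimal sketch contrasting them for a single observed dimension (the weight vector `W_d` and the binary cause vector `s` are illustrative assumptions):

```python
def noisy_or(s, W_d):
    """P(y_d = 1 | s) under the noisy-OR rule: the observation is on
    unless every active cause independently fails to trigger it."""
    p_off = 1.0
    for sk, w in zip(s, W_d):
        if sk:
            p_off *= 1.0 - w
    return 1.0 - p_off

def max_superposition(s, W_d):
    """Maximum-superposition rule: the strongest active cause alone
    determines the observed value."""
    active = [w for sk, w in zip(s, W_d) if sk]
    return max(active) if active else 0.0
```

With two active causes of strengths 0.8 and 0.5, noisy-OR yields 1 - 0.2 * 0.5 = 0.9, while maximum superposition yields 0.8; a linear superposition would give 1.3, exceeding the valid probability range, which is one motivation for these nonlinear rules on binary data.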