2013 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp.2013.6661904

Large scale inference in the Infinite Relational Model: Gibbs sampling is not enough

Cited by 5 publications (2 citation statements)
References 8 publications
“…Due to the size of the networks, it is not computationally feasible to reach convergence. However, the Gibbs sampler quickly reaches a stable cluster assignment with high posterior likelihood, which is treated as a point estimate of the parameters (Albers et al., 2013). We hence treat the last sampled state as the inferred parameters.…”
Section: Methods
confidence: 99%
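The practice described in this citation statement — running a Gibbs sampler until the cluster assignment stabilizes and treating the last sampled state as a point estimate — can be sketched as follows. This is a minimal illustration, not the cited authors' implementation: it assumes a simplified Beta-Bernoulli blockmodel with a fixed number of blocks K (rather than the full Infinite Relational Model), a directed adjacency matrix with self-links allowed, and hypothetical toy data.

```python
import math
import numpy as np

def betaln(a, b):
    """log Beta function via log-gamma."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def log_joint(A, z, K, a=1.0, b=1.0):
    """Collapsed Beta-Bernoulli log-likelihood of a directed adjacency
    matrix A under block assignment z (uniform prior over labels)."""
    ll = 0.0
    for k in range(K):
        for l in range(K):
            sub = A[np.ix_(z == k, z == l)]
            ones, pairs = int(sub.sum()), sub.size
            ll += betaln(a + ones, b + pairs - ones) - betaln(a, b)
    return ll

def gibbs_sweep(A, z, K, rng):
    """One systematic-scan Gibbs sweep over all node assignments."""
    for i in range(len(z)):
        logp = np.empty(K)
        for k in range(K):              # score each candidate label for node i
            z[i] = k
            logp[k] = log_joint(A, z, K)
        p = np.exp(logp - logp.max())   # normalize in log space for stability
        z[i] = rng.choice(K, p=p / p.sum())
    return z

# Planted two-block network (hypothetical toy data, not from the paper).
rng = np.random.default_rng(0)
n, K = 20, 2
z_true = np.repeat([0, 1], n // 2)
P = np.array([[0.9, 0.1], [0.1, 0.9]])           # within/between link rates
A = (rng.random((n, n)) < P[np.ix_(z_true, z_true)]).astype(int)

z_init = rng.integers(K, size=n)                 # random initial assignment
z = z_init.copy()
for _ in range(20):                              # short, non-converged run
    z = gibbs_sweep(A, z, K, rng)

# The last sampled state serves as the point estimate of the partition.
print("log joint:", round(log_joint(A, z_init, K), 1),
      "->", round(log_joint(A, z, K), 1))
```

On a small, strongly separated network like this, the sampler typically settles on a high-posterior assignment within a few sweeps, which is exactly the behavior the citing authors exploit when they stop short of full convergence.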
“…We also observe that the Bayesian Plaid model does not necessarily obtain perfect recovery even if K = K_true. This may be explained by the fact that MCMC inference on BNP exhaustive bi-clustering models is easily trapped at local optima in practice (Albers et al., 2013). We expect the performance will improve with more MCMC iterations.…”
Section: Synthetic Data Experiments
confidence: 99%