2009
DOI: 10.1007/978-3-642-02568-6_50
Accelerating Collapsed Variational Bayesian Inference for Latent Dirichlet Allocation with Nvidia CUDA Compatible Devices

Abstract: In this paper, we propose an acceleration of collapsed variational Bayesian (CVB) inference for latent Dirichlet allocation (LDA) by using Nvidia CUDA compatible devices. While LDA is an efficient Bayesian multi-topic document model, it requires complicated computations for parameter estimation in comparison with other simpler document models, e.g. probabilistic latent semantic indexing, etc. Therefore, we accelerate CVB inference, an efficient deterministic inference method for LDA, with Nvidia CUDA…
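The CVB update the abstract refers to can be sketched in its common zero-order (CVB0) simplification as a minimal NumPy loop. All names below (`cvb0_step`, the count tables `n_wk`, `n_dk`, `n_k`) are illustrative assumptions, not code from the paper, and the paper's full CVB inference additionally propagates count variances, which this sketch omits:

```python
import numpy as np

def cvb0_step(tokens, gamma, n_wk, n_dk, n_k, alpha, beta, V):
    """One sweep of a zero-order collapsed variational Bayes (CVB0)
    update over all tokens. `tokens` is a list of (doc_id, word_id)
    pairs; `gamma[t]` is token t's variational topic distribution
    (a length-K vector summing to 1)."""
    for t, (d, w) in enumerate(tokens):
        g = gamma[t]
        # Subtract this token's current responsibility from the counts.
        n_wk[w] -= g
        n_dk[d] -= g
        n_k -= g
        # CVB0 update rule, elementwise over the K topics:
        # gamma ∝ (n_wk + beta) * (n_dk + alpha) / (n_k + V*beta)
        g = (n_wk[w] + beta) * (n_dk[d] + alpha) / (n_k + V * beta)
        g = g / g.sum()
        # Add the refreshed responsibility back.
        n_wk[w] += g
        n_dk[d] += g
        n_k += g
        gamma[t] = g
    return gamma
```

Because the update touches only small count tables and a per-token responsibility vector, the per-token work maps naturally onto CUDA threads, which is the opportunity the paper exploits.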

Cited by 11 publications (12 citation statements) | References 12 publications
“…• Masada, et al [8] and Yan, et al [13] made use of GPGPU for the parallel inference of LDA. Their focus was on the limit of memory size of GPGPU, which is different from our motivation of this paper.…”
Section: Fast Inference Methods for LDA
confidence: 99%
“…Note that there are NaN data in FDR of VLR, which cannot be shown in Figure 2 and Figure 3. This is due to the fact that, when the sample size is too small compared with the number of regression coefficients, VLR infers all coefficients to some small values; thus all the regressors are determined as 0, VLR therefore makes no discoveries, and the FDR will be NaN.…”
Section: Part of Prior Mean of the Nonzero Coefficients in Models Are
confidence: 99%
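The NaN behaviour described in that excerpt follows directly from the definition FDR = FP / (FP + TP): when every coefficient is shrunk to zero there are no discoveries, and the ratio is 0/0. A minimal sketch (the `fdr` helper below is hypothetical, not code from the cited paper):

```python
import math

def fdr(predicted_nonzero, true_nonzero):
    """False discovery rate FP / (FP + TP) over coefficients declared
    nonzero. Returns NaN when nothing is declared nonzero, i.e. when
    there are zero discoveries."""
    tp = sum(1 for p, t in zip(predicted_nonzero, true_nonzero) if p and t)
    fp = sum(1 for p, t in zip(predicted_nonzero, true_nonzero) if p and not t)
    if tp + fp == 0:  # no discoveries at all -> 0/0
        return math.nan
    return fp / (tp + fp)
```

With all coefficients determined as 0 (e.g. `fdr([0, 0, 0], [1, 0, 1])`), the denominator is zero and the result is NaN, matching the missing points in the figures.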
“…Meanwhile, variational approximation techniques have also been extensively studied to solve linear regression models [8][9][10][11][12]. The variational approximation methods assume specific prior distributions, adopt an approximation to the joint posterior distribution, and provide an analytical iterative expression for the posterior probability of the unknown regressors.…”
Section: Introduction
confidence: 99%
“…While the above formula is computationally intractable in most cases, approximations exist. Using collapsed Gibbs sampling with randomization, a 26x speedup has been achieved over a serial implementation [57], and a 196x speedup by using collapsed variational Bayes [31,57], a deterministic method that avoids randomization.…”
Section: Text Mining on GPUs
confidence: 99%
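The collapsed Gibbs sampler contrasted with CVB in that excerpt resamples each token's topic from its full conditional given all other assignments; the per-token loop below is what the cited GPU implementations parallelize. A serial sketch under assumed names (`gibbs_sweep` and the count tables are illustrative, not from either paper):

```python
import numpy as np

def gibbs_sweep(tokens, z, n_wk, n_dk, n_k, alpha, beta, V, rng):
    """One sweep of collapsed Gibbs sampling for LDA. Each token's
    topic assignment z[t] is resampled from its full conditional
    after removing the token from the count tables."""
    K = n_k.shape[0]
    for t, (d, w) in enumerate(tokens):
        k = z[t]
        # Remove this token from the counts.
        n_wk[w, k] -= 1; n_dk[d, k] -= 1; n_k[k] -= 1
        # Full conditional over the K topics (up to normalization).
        p = (n_wk[w] + beta) * (n_dk[d] + alpha) / (n_k + V * beta)
        p = p / p.sum()
        # Randomized draw -- this is the step CVB replaces with a
        # deterministic responsibility update.
        k = rng.choice(K, p=p)
        # Put the token back under its new topic.
        n_wk[w, k] += 1; n_dk[d, k] += 1; n_k[k] += 1
        z[t] = k
    return z
```

Note that the update formula has the same shape as the CVB responsibility, which is why the two methods share most of their memory layout on a GPU; the difference is the random draw versus a deterministic soft assignment.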