2012
DOI: 10.1007/978-3-642-31178-9_15

GPU-Accelerated Non-negative Matrix Factorization for Text Mining

Cited by 20 publications (8 citation statements)
References 8 publications
“…It is worth mentioning that our tests do not include a performance comparison between NMF-mGPU and the other NMF implementations on GPU [16, 40-42] described in the Background section. As previously stated, these applications do not take into account the available GPU memory, nor make use of multiple GPU devices.…”
Section: Results
confidence: 99%
“…To the best of our knowledge, there are only a few GPU implementations of the NMF algorithm [16, 40-42], but these domain-specific applications do not perform any blockwise processing, since they do not consider the available amount of GPU memory, nor make use of multiple GPU devices. Therefore, they are not suitable for the analysis of current large biological datasets.…”
Section: Introduction
confidence: 99%
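The excerpt above contrasts memory-aware, block-wise NMF with GPU implementations that assume the whole matrix fits on the device. A minimal NumPy sketch of that idea is given below; the memory budget, the block-sizing heuristic, and the function name are illustrative assumptions, not the cited NMF-mGPU code.

```python
import numpy as np

def blockwise_h_update(X, W, H, mem_budget_bytes=64 * 1024 * 1024):
    """One multiplicative update of H in X ~ W H, processing X in column blocks.

    The block width is chosen so that a block of X plus its update buffers
    stays within a (hypothetical) memory budget, mimicking the
    GPU-memory-aware processing described in the excerpt.
    """
    n, m = X.shape
    k = W.shape[1]
    bytes_per_col = X.itemsize * (n + 3 * k)      # X block + H block + numerator + denominator
    block_cols = max(1, int(mem_budget_bytes // bytes_per_col))
    WtW = W.T @ W                                 # k x k, reused for every block
    for j0 in range(0, m, block_cols):
        j1 = min(j0 + block_cols, m)
        Xb, Hb = X[:, j0:j1], H[:, j0:j1]         # a real GPU code would copy these to the device
        numer = W.T @ Xb
        denom = WtW @ Hb + 1e-9
        H[:, j0:j1] = Hb * numer / denom          # and copy the updated block back to the host
    return H
```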
“…Since Lee and Seung's Nature paper [1], NMF has been extensively studied and has found a great many applications in science and engineering. It has become an important mathematical method in machine learning and data mining and has been widely used in feature extraction, image analysis [3], audio processing [4], recommendation systems [5,6], pattern recognition, data clustering [7], topic modeling [8], text mining [9], bioinformatics [10], and so forth. Unlike other factorization methods (e.g., PCA, ICA, SVD, VQ, etc.…”
Section: Introduction
confidence: 99%
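For context on the algorithm cited in the excerpt above, here is a compact sketch of Lee-Seung style multiplicative updates for the Frobenius-norm objective, in plain NumPy. The function name, iteration count, and test matrix are illustrative and not taken from any of the cited papers.

```python
import numpy as np

def nmf_multiplicative(X, k, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing ||X - W H||_F^2."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

# Example: factorize a small non-negative "term-document" matrix.
X = np.random.default_rng(1).random((100, 40))
W, H = nmf_multiplicative(X, k=5)
print("reconstruction error:", np.linalg.norm(X - W @ H))
```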
“…This representation is more appropriate for non-square data because it explicitly models data interactions through a latent factor S [12]. Several optimization techniques for parallel non-negative (two-factor) matrix factorization have recently been proposed [13-15]. These techniques first partition matrix X into blocks and then exploit the block-matrix multiplication when learning U and V.…”
Section: Introduction
confidence: 99%
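Below is a short NumPy sketch of the block-matrix-multiplication pattern mentioned in the excerpt above, applied to the U update of a tri-factorization X ≈ U S Vᵀ. The block size, the helper names, and the unconstrained multiplicative update rule are assumptions made for illustration; they do not reproduce the cited implementations [13-15].

```python
import numpy as np

def blockwise_XV(X, V, block=1024):
    """Compute X @ V as a sum over column blocks of X (row blocks of V).

    Each block product fits in limited per-worker memory, and the partial
    results are simply accumulated; the blocks are independent, so they
    can be processed in parallel.
    """
    acc = np.zeros((X.shape[0], V.shape[1]))
    for j0 in range(0, X.shape[1], block):
        j1 = min(j0 + block, X.shape[1])
        acc += X[:, j0:j1] @ V[j0:j1, :]
    return acc

def trifactor_update_U(X, U, S, V, eps=1e-9):
    """One multiplicative update of U in X ~ U S V^T, using the blockwise product."""
    XVS = blockwise_XV(X, V) @ S.T       # (X V) S^T, the numerator
    USVVS = U @ (S @ (V.T @ V) @ S.T)    # U S V^T V S^T, the denominator
    return U * XVS / (USVVS + eps)
```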