2015 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2015.19
Infinite Author Topic Model Based on Mixed Gamma-Negative Binomial Process

Abstract: Incorporating the side information of a text corpus, i.e., authors, time stamps, and emotional tags, into traditional text mining models has gained significant interest in the areas of information retrieval, statistical natural language processing, and machine learning. One branch of this work is the so-called Author Topic Model (ATM), which incorporates the authors' interests as side information into the classical topic model. However, the existing ATM needs to predefine the number of topics, which is di…

Cited by 10 publications (6 citation statements); references 27 publications.
“…Further, by using dependent GaPs or a mixed GaP-NBP, additional information from the corpus can be introduced into a model. [191] incorporated links between documents (e.g., citations between scientific papers) using dependent GaPs, and authorship information using a mixed GaP-NBP [190]. -Multiple corpora.…”
Section: Text Mining
confidence: 99%
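The mixed GaP-NBP construction quoted above can be sketched, at a very high level, as shared gamma-process topic weights driving per-author negative binomial counts. The following is a minimal simulation, not the paper's actual model: the truncation level `K`, concentration `alpha`, and NB probability `p` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truncated gamma-process sketch: K atoms with gamma-distributed weights.
# (K, alpha, and p below are illustrative, not the paper's hyperparameters.)
K, n_authors, alpha = 20, 3, 1.0
topic_rates = rng.gamma(shape=alpha / K, scale=1.0, size=K)  # shared topic weights

# Each author's per-topic counts: a Poisson draw whose rate is gamma-mixed,
# i.e., marginally a negative binomial count driven by the shared weights.
p = 0.5
author_counts = np.vstack([
    rng.poisson(rng.gamma(topic_rates, p / (1 - p)))  # gamma-Poisson mixture = NB
    for _ in range(n_authors)
])
print(author_counts.shape)  # (3, 20)
```

The shared `topic_rates` are what let authors reuse a common (and, untruncated, potentially infinite) set of topics while keeping author-specific counts.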
“…On the other hand, exploring hidden information from observed data has been a common goal in data mining applications such as recommender systems [18], information retrieval [32], statistical natural language processing [16], and so on. Among them, probabilistic graphical models are widely used owing to their huge success in classifying topics from contexts [3].…”
Section: Related Work
confidence: 99%
“…A similar idea was adopted in the gamma-negative binomial process [43,44], the beta-negative binomial process [45], the hierarchical beta process [46], and hierarchical Poisson models [47]. The different stochastic processes used in these models, e.g., beta, gamma, Poisson, and negative binomial processes, are stacked to account for different kinds of data (i.e., binary or count data) in the hierarchical structure.…”
Section: Learning From Hierarchical Structures Using Bayesian Nonpara...
confidence: 99%
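The stacking described in this last statement bottoms out in a standard identity: a negative binomial count is a Poisson draw whose rate is gamma-distributed. A minimal NumPy check of that identity by simulation (the parameters `r`, `p`, and `n` are illustrative, not taken from any cited model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Identity underlying the gamma-negative binomial hierarchy:
#   lambda ~ Gamma(r, p/(1-p)),  x | lambda ~ Poisson(lambda)
#   =>  x ~ NB(r, p),  with mean r * p / (1 - p).
r, p, n = 5.0, 0.3, 200_000

lam = rng.gamma(shape=r, scale=p / (1 - p), size=n)
x_mixture = rng.poisson(lam)                        # gamma-Poisson mixture
x_direct = rng.negative_binomial(r, 1 - p, size=n)  # NumPy: failures before r successes

# Both sample means should sit near r*p/(1-p) ≈ 2.14.
print(round(x_mixture.mean(), 2), round(x_direct.mean(), 2))
```

Note the parameterization mismatch: NumPy's `negative_binomial(n, p)` takes the success probability, so `NB(r, p)` in the mean-`r*p/(1-p)` convention maps to `negative_binomial(r, 1 - p)`.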