2019 International Joint Conference on Neural Networks (IJCNN) 2019
DOI: 10.1109/ijcnn.2019.8852366
Short Text Topic Modeling with Flexible Word Patterns

Cited by 14 publications (10 citation statements)
References 19 publications
“…To improve the performance of short text topic modeling, the Biterm Topic Model (BTM) (Yan et al., 2013) and the Dirichlet Multinomial Mixture (DMM) model (Nigam et al., 2000; Sadamitsu et al., 2007; Yin and Wang, 2014) are two basic probabilistic short-text topic models; both employ traditional Bayesian inference methods, including Gibbs sampling (Steyvers and Griffiths, 2007) and variational inference (Blei et al., 2017). Several extensions of BTM and DMM have also been proposed, such as the Generalized Pólya Urn DMM (GPU-DMM) (Li et al., 2016), which incorporates word embeddings, and the Multiterm Topic Model (Wu and Li, 2019). In addition, Semantics-assisted Non-negative Matrix Factorization (SeaNMF) (Shi et al., 2018) was recently proposed as an NMF-based topic model that incorporates word-context semantic correlations and is solved by a block coordinate descent algorithm.…”
Section: Related Work
confidence: 99%
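BTM's basic modeling unit is the biterm: every unordered pair of words that co-occurs in the same short text. A minimal sketch of this extraction step (illustrative only; the function name and tokenized-document format are assumptions, not taken from the cited papers):

```python
from itertools import combinations

def extract_biterms(tokens):
    # All unordered word pairs co-occurring in one short text --
    # the basic modeling unit of BTM (Yan et al., 2013).
    return [tuple(sorted(pair)) for pair in combinations(tokens, 2)]

# An n-word document yields n*(n-1)/2 biterms, so a 3-word text gives 3.
print(extract_biterms(["short", "text", "topic"]))
# → [('short', 'text'), ('short', 'topic'), ('text', 'topic')]
```

Because every pair in a short text is kept, the number of biterms grows quadratically with document length, which is one source of the irrelevant biterms that later models try to avoid.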
“…However, in this model some words lack semantic relevance and tend to reduce efficiency. Wu and Li (2019) proposed the Multi-term Topic Model (MTM), which extracts variable-length, multiple correlated word patterns from short texts to infer the latent topics. This model overcomes limitations of BTM, such as the extraction of many irrelevant and useless biterms.…”
Section: Global Word Co-occurrences Based ASTTM Models
confidence: 99%
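One way to picture MTM's variable-length word patterns, in contrast to BTM's fixed word pairs, is enumerating word sets of length 2 up to some maximum and keeping only those that recur across documents. The sketch below is a hypothetical illustration of that idea; the function, its parameters, and the simple frequency-threshold mining are assumptions, not the paper's actual pattern-extraction algorithm:

```python
from itertools import combinations
from collections import Counter

def extract_word_patterns(docs, max_len=3, min_count=2):
    # Hypothetical sketch: count variable-length word sets (length 2..max_len)
    # within each short text and keep those occurring in at least min_count
    # documents. Recurring sets stand in for "correlated word patterns";
    # rare sets (analogous to irrelevant biterms) are filtered out.
    counts = Counter()
    for doc in docs:
        vocab = sorted(set(doc))
        for k in range(2, max_len + 1):
            for pattern in combinations(vocab, k):
                counts[pattern] += 1
    return {p: c for p, c in counts.items() if c >= min_count}

docs = [["topic", "model", "short"],
        ["topic", "model", "text"],
        ["short", "topic", "model"]]
patterns = extract_word_patterns(docs)
print(patterns[("model", "topic")])  # → 3
```

The `min_count` filter is what distinguishes this from plain biterm extraction: a pattern must be supported by multiple documents before it contributes to topic inference.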
“…Machine learning based implementations of various topic modelling techniques have been studied extensively by many researchers [2, 3, 4, 5, 6]. Organizing, searching, and summarizing large volumes of textual data are major tasks in NLP, and topic modelling can be used to solve these problems to a certain extent.…”
Section: Review Of Related Work
confidence: 99%
“…There is a demand for automatic topic modeling and text summarization systems, due to the enormous volume of text data available today and the limits of human reading ability. Multiple techniques are used to discover topics from text, images, and video [1]. Topic models have been extensively utilized in Topic Recognition and Tracking tasks, which help to track, detect, and designate topics in a stream of documents or texts [4]. This machine learning technique is widely used in NLP applications to analyze unstructured textual data and automatically discover the abstract topics within it.…”
Section: Introduction
confidence: 99%