Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.32

Neural Topic Modeling with Bidirectional Adversarial Training

Abstract: Recent years have witnessed a surge of interest in using neural topic models for automatic topic extraction from text, since they avoid the complicated mathematical derivations for model inference required by traditional topic models such as Latent Dirichlet Allocation (LDA). However, these models typically either assume an improper prior (e.g., Gaussian or logistic normal) over the latent topic space or cannot infer the topic distribution for a given document. To address these limitations, we propose the Bidirectional Adversarial Topic (BAT) model, which applies bidirectional adversarial training for neural topic modeling.
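
As a rough illustration of the bidirectional adversarial training described in the abstract, the sketch below pairs an encoder (document word distribution to topic distribution) and a generator (topic distribution to word distribution) against a discriminator that judges joint pairs, in the style of BiGAN/ALI (Donahue et al., 2016). The layer sizes, the vanilla GAN loss, and the random document batch are illustrative assumptions, not the paper's exact architecture or objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

V, K = 2000, 50  # vocabulary size and topic count (assumed, not from the paper)

# Encoder: document word distribution -> topic distribution.
encoder = nn.Sequential(nn.Linear(V, 256), nn.LeakyReLU(0.2),
                        nn.Linear(256, K), nn.Softmax(dim=-1))
# Generator: sampled topic distribution -> word distribution.
generator = nn.Sequential(nn.Linear(K, 256), nn.LeakyReLU(0.2),
                          nn.Linear(256, V), nn.Softmax(dim=-1))
# Discriminator judges joint (topic distribution, word distribution) pairs.
discriminator = nn.Sequential(nn.Linear(K + V, 256), nn.LeakyReLU(0.2),
                              nn.Linear(256, 1))

prior = torch.distributions.Dirichlet(torch.full((K,), 0.1))  # Dirichlet prior

def discriminator_loss(d_real):
    """d_real: batch of normalized document word distributions, shape (B, V)."""
    theta = prior.sample((d_real.size(0),))                   # theta ~ Dir(alpha)
    real_pair = torch.cat([encoder(d_real), d_real], dim=-1)  # (encoded, real doc)
    fake_pair = torch.cat([theta, generator(theta)], dim=-1)  # (prior, generated)
    ones = torch.ones(d_real.size(0), 1)
    zeros = torch.zeros(d_real.size(0), 1)
    # Encoder/generator are trained with the opposite objective (omitted here).
    return (F.binary_cross_entropy_with_logits(discriminator(real_pair), ones) +
            F.binary_cross_entropy_with_logits(discriminator(fake_pair), zeros))

docs = torch.rand(8, V)
docs = docs / docs.sum(dim=-1, keepdim=True)  # placeholder normalized documents
loss = discriminator_loss(docs)
```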

Cited by 56 publications (21 citation statements)
References 20 publications

“…As such, since one important goal of topic models is to be used as tools for humans to make sense of and explore document collections, the recent trend is to evaluate topics based on coherence metrics. The works of Röder et al. (2015) and Wang et al. (2020) are examples of topic coherence evaluation. Here, we take the same approach to evaluation.…”
Section: Results (mentioning, confidence: 99%)
“…Another model based on bi-directional GANs is Gaussian-BAT (Wang et al., 2020), which uses a Dirichlet prior and can infer topic distributions of input documents. Additionally, Gaussian-BAT models each topic with a multivariate Gaussian and incorporates word relatedness into the modeling process.…”
Section: Related Work (mentioning, confidence: 99%)
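
To make the word-relatedness point concrete: if each topic is a multivariate Gaussian in word-embedding space, the topic-word distribution follows from evaluating every word's embedding under that density, so related words (nearby embeddings) receive similar probability. A minimal sketch, with placeholder (untrained) embeddings and topic parameters:

```python
import torch

V, D = 2000, 300                 # vocabulary size and embedding dim (assumed)
embeddings = torch.randn(V, D)   # placeholder rows; real models use pretrained vectors
mu = torch.zeros(D)              # topic mean (learned in the actual model)
cov = torch.eye(D)               # topic covariance (learned in the actual model)

topic = torch.distributions.MultivariateNormal(mu, covariance_matrix=cov)
log_density = topic.log_prob(embeddings)            # log N(e_w; mu, cov), shape (V,)
# Normalizing the density over the vocabulary equals a softmax of the log-density.
topic_word_dist = torch.softmax(log_density, dim=0)
```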
“…To overcome the problem that the Dirichlet distribution is not a location-scale family, which hinders the reparameterization trick used in the VAE framework, Srivastava and Sutton (2017) employ a Laplace approximation for modeling a Dirichlet prior over the latent variables; Joo et al. (2020) approximate the inverse cumulative distribution function of the Gamma distribution, which is a component of the Dirichlet distribution; Zhang et al. (2018) utilize the Weibull distribution; and Burkhardt and Kramer (2019a) solve this problem with rejection sampling. Meanwhile, to utilize a Dirichlet prior directly, Wang et al. (2020a) abandon the VAE framework and propose the Bidirectional Adversarial Topic (BAT) model, which applies bidirectional adversarial training for neural topic modeling. All these unsupervised neural topic models have achieved competitive results.…”
Section: Related Work (mentioning, confidence: 99%)
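
For concreteness, here is the Laplace-approximation route mentioned first in this passage (Srivastava and Sutton, 2017): the Dirichlet prior is approximated in the softmax basis by a Gaussian whose moments are derived from the Dirichlet parameters, which restores the standard Gaussian reparameterization trick. A small sketch, with an assumed symmetric prior:

```python
import torch

def dirichlet_softmax_laplace(alpha):
    """Gaussian moments approximating Dir(alpha) in the softmax basis
    (Srivastava and Sutton, 2017)."""
    K = alpha.size(0)
    mu = alpha.log() - alpha.log().mean()
    var = (1.0 / alpha) * (1.0 - 2.0 / K) + (1.0 / alpha).sum() / K**2
    return mu, var

alpha = torch.full((50,), 0.02)         # symmetric Dirichlet prior (assumed)
mu, var = dirichlet_softmax_laplace(alpha)
eps = torch.randn_like(mu)              # reparameterized Gaussian sample
theta = torch.softmax(mu + var.sqrt() * eps, dim=0)  # approximate Dirichlet draw
```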
“…The proposed ToMCAT is partly inspired by ATM but differs in its capability of inferring document-specific topic distributions and incorporating supervision information. BAT (Wang et al., 2020) is an extension of ATM that employs bidirectional adversarial training (Donahue et al., 2016) for document-specific topic distribution inference. Although BAT similarly utilizes an adversarial training objective to guide the learning of topic distributions, there are some major differences.…”
Section: Neural Topic Modeling (mentioning, confidence: 99%)
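
The practical consequence of the bidirectional setup noted above is that, once trained, a BAT-style encoder infers a document's topic distribution in a single forward pass, with no per-document optimization. A hypothetical usage sketch (untrained weights and a placeholder document, mirroring the encoder shape from the earlier snippet):

```python
import torch
import torch.nn as nn

V, K = 2000, 50                     # assumed vocabulary size and topic count
encoder = nn.Sequential(nn.Linear(V, 256), nn.LeakyReLU(0.2),
                        nn.Linear(256, K), nn.Softmax(dim=-1))
# In practice, load the adversarially trained encoder weights here.

doc = torch.rand(1, V)
doc = doc / doc.sum()               # placeholder normalized word distribution
theta = encoder(doc)                # document-specific topic distribution
print(theta.topk(5).indices)        # indices of the five most prominent topics
```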