Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.310

Neural Topic Modeling by Incorporating Document Relationship Graph

Abstract: Graph Neural Networks (GNNs), which capture the relationships between graph nodes via message passing, have been a hot research direction in the natural language processing community. In this paper, we propose the Graph Topic Model (GTM), a GNN-based neural topic model that represents a corpus as a document relationship graph. Documents and words in the corpus become nodes in the graph and are connected based on document-word co-occurrences. By introducing the graph structure, the relationships between documents are e…
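As an illustration of the bipartite document-word graph the abstract describes, here is a minimal sketch that connects document nodes and word nodes with TF-IDF-weighted co-occurrence edges. The helper name and edge representation are hypothetical; the actual GTM implementation may differ.

```python
from collections import Counter
import math

def build_doc_word_graph(docs):
    """Build a bipartite document-word graph with TF-IDF edge weights.

    Nodes: ('doc', i) for each document, ('word', w) for each vocabulary word.
    Edges: {(doc_node, word_node): tf-idf weight}. Hypothetical helper
    sketching the paper's setup, not the authors' code.
    """
    n_docs = len(docs)
    # Document frequency: number of documents containing each word.
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    edges = {}
    for i, doc in enumerate(docs):
        tf = Counter(doc)
        for w, count in tf.items():
            idf = math.log(n_docs / df[w])
            edges[(('doc', i), ('word', w))] = (count / len(doc)) * idf
    return edges

docs = [["graph", "topic", "model"],
        ["topic", "model", "neural"],
        ["graph", "neural", "network"]]
g = build_doc_word_graph(docs)  # 9 edges: each doc links to its 3 words
```

A GNN layer would then pass messages along these weighted edges so that a document node aggregates information from its words and, transitively, from other documents sharing those words.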

Cited by 12 publications (4 citation statements)
References 18 publications
“…Other methods such as have no code. We only include methods that leverage language models, to make an apples-to-apples comparison, and exclude methods using graph neural networks (Zhou et al., 2020) or reinforcement learning (Costello and Reformat, 2023b).…”
Section: Compared Methods Selection
confidence: 99%
“…A biterm refers to an unordered word pair that co-occurs in the same document, originally from Yan et al. (2013). Similarly, Yang et al. (2020) and Zhou et al. (2020) use a bipartite graph of documents and words, connected by word occurrences or TF-IDF values. Wang et al. (2021b) use word co-occurrence and semantic correlation graphs.…”
Section: NTMs With Graph Neural Network
confidence: 99%
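The biterm notion quoted above can be sketched in a few lines: extract every unordered pair of distinct words that co-occur in a document. This is a hypothetical helper for illustration, not the cited papers' code.

```python
from itertools import combinations

def extract_biterms(doc):
    """Return the set of biterms in a document: unordered pairs of
    distinct words that co-occur in it (Yan et al., 2013).
    Illustrative helper, not the cited papers' implementation."""
    # sorted(set(...)) deduplicates tokens; combinations yields each
    # unordered pair of distinct words exactly once.
    return {frozenset(pair) for pair in combinations(sorted(set(doc)), 2)}

doc = ["topic", "model", "topic", "graph"]
biterms = extract_biterms(doc)  # 3 unique words -> 3 biterms
```

Models like BTM and GraphBTM operate on such pairs rather than on whole documents, which helps with short, sparse texts.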
“…GraphBTM [Zhu et al., 2018], based on the same assumption as BTM, represents biterms as a graph and uses a Graph Convolutional Network (GCN) [Kipf and Welling, 2017] to extract topic mixtures from them. Zhou et al. [2020] propose the Graph Topic Model (GTM), which extracts topic mixtures by processing a word-document graph with a GNN to further improve topic coherence. Note that the word-document graph is distinct from an explicit document-document graph, as it is computed solely from word occurrences in the documents.…”
Section: Neural Topic Models
confidence: 99%