Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.744

Enriching and Controlling Global Semantics for Text Summarization

Abstract: Recently, Transformer-based models have been proven effective in the abstractive summarization task by creating fluent and informative summaries. Nevertheless, these models still suffer from the short-range dependency problem, causing them to produce summaries that miss the key points of the document. In this paper, we attempt to address this issue by introducing a neural topic model empowered with normalizing flow to capture the global semantics of the document, which are then integrated into the summarization model…
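The abstract's core idea, a neural topic model whose latent posterior is enriched with normalizing flows, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the planar-flow choice, all class names, and all hyperparameters are assumptions.

```python
# Minimal sketch (not the authors' code) of a neural topic model whose
# latent is enriched with planar normalizing flows, in PyTorch. All names
# and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar flow step: f(z) = z + u * tanh(w^T z + b).
    (The invertibility constraint on u, w is omitted for brevity.)"""
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        lin = z @ self.w + self.b                          # (batch,)
        f_z = z + self.u * torch.tanh(lin).unsqueeze(-1)   # (batch, dim)
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
        return f_z, log_det

class FlowTopicEncoder(nn.Module):
    """Encode a bag-of-words vector into a topic latent z, then push z
    through K flow steps so its posterior need not stay Gaussian."""
    def __init__(self, vocab_size, hidden, n_topics, n_flows=4):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, n_topics)
        self.logvar = nn.Linear(hidden, n_topics)
        self.flows = nn.ModuleList([PlanarFlow(n_topics) for _ in range(n_flows)])

    def forward(self, bow):
        h = self.mlp(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        log_det_sum = torch.zeros(z.size(0))
        for flow in self.flows:
            z, log_det = flow(z)
            log_det_sum = log_det_sum + log_det
        # z is the global-semantics vector handed to the summarizer;
        # log_det_sum enters the KL term of the ELBO.
        return z, mu, logvar, log_det_sum
```

The flow steps warp the reparameterized Gaussian sample into a more expressive, non-Gaussian posterior, which is what lets the topic latent capture richer global semantics than a plain Gaussian VAE.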

Cited by 17 publications (6 citation statements) · References 37 publications

Citation statements (ordered by relevance):
“…Our model belongs to the first category, akin to prior studies like Setiawan et al. (2020) and Fu et al. (2020). In contrast, other works, including Zheng et al. (2020), Nguyen et al. (2021), and Zou et al. (2021), adopt the second type by jointly modeling topics and sequence-to-sequence generation. Most of them assume a simple Gaussian latent prior, except for Nguyen et al. (2021), which employs normalizing flows to model neural topic models and enrich global semantics.…”
Section: Variational Summarization (mentioning, confidence: 99%)
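For orientation, "jointly modeling topics and sequence-to-sequence generation" typically means optimizing a single variational bound over the summary and a topic latent. In assumed notation (not taken from any of the cited papers), with document x, summary y, and topic latent z:

```latex
% Joint ELBO (assumed notation): document x, summary y, topic latent z
\log p_\theta(y \mid x) \;\ge\;
  \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log p_\theta(y \mid x, z) \right]
  - \mathrm{KL}\!\left( q_\phi(z \mid x) \,\|\, p(z) \right)
```

The "simple Gaussian latent prior" the quote refers to is the choice p(z) = N(0, I); normalizing flows instead turn the posterior into a learned, more flexible density.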
“…In contrast, other works, including Zheng et al. (2020), Nguyen et al. (2021), and Zou et al. (2021), adopt the second type by jointly modeling topics and sequence-to-sequence generation. Most of them assume a simple Gaussian latent prior, except for Nguyen et al. (2021), which employs normalizing flows to model neural topic models and enrich global semantics. However, they did not specify the choice of normalizing flows or how they addressed posterior collapse.…”
Section: Variational Summarization (mentioning, confidence: 99%)
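The quote notes that the posterior-collapse remedy is unspecified. One common, generic mitigation is KL annealing; the sketch below is an assumption about what such a fix could look like (the schedule, warmup length, and function name are illustrative), reusing mu, logvar, and log_det_sum from the encoder sketch above.

```python
# Hedged sketch of KL annealing, a common remedy for posterior collapse.
# The cited papers do not state their actual fix; this schedule and the
# function name are illustrative assumptions.
import torch

def elbo_loss(recon_nll, mu, logvar, log_det_sum, step, warmup=10_000):
    """Negative ELBO with an annealed KL weight and a flow correction."""
    # KL(q0(z|x) || N(0, I)) in closed form for the base Gaussian posterior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
    kl = kl - log_det_sum              # flow log-dets adjust the KL term
    beta = min(1.0, step / warmup)     # ramp the KL weight from 0 up to 1
    return (recon_nll + beta * kl).mean()
```

Ramping beta up from 0 keeps the decoder from learning to ignore z early in training, which is the usual symptom of posterior collapse.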
“…Therefore, it is important to retain the most critical contents of documents and verbalize them in the generated text. To address the issue of key points missing from the output text, Nguyen et al. [134] introduced a topic model to capture the global semantics of the document and a mechanism to control the amount of global semantics supplied to the text generation module. Similarly, Liu et al. [114] also proposed two topic-aware contrastive learning objectives to capture the global topic information of a conversation and outline salient facts.…”
Section: Document Representation Learning (mentioning, confidence: 99%)
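A "mechanism to control the amount of global semantics" is commonly realized as a learned gate between decoder states and the topic vector. The sketch below is a hedged guess at such a gate, not the module from Nguyen et al. [134]; SemanticGate and its dimensions are invented for illustration.

```python
# Hedged guess at a gate that controls how much global topic information
# flows into the decoder; SemanticGate is invented for illustration and is
# not the module from Nguyen et al. [134].
import torch
import torch.nn as nn

class SemanticGate(nn.Module):
    def __init__(self, hidden_dim, topic_dim):
        super().__init__()
        self.proj = nn.Linear(topic_dim, hidden_dim)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, dec_hidden, topic_vec):
        # dec_hidden: (batch, seq, hidden); topic_vec: (batch, topic_dim)
        t = self.proj(topic_vec).unsqueeze(1).expand_as(dec_hidden)
        g = torch.sigmoid(self.gate(torch.cat([dec_hidden, t], dim=-1)))
        # g near 1 keeps the local decoder state; g near 0 injects topic info.
        return g * dec_hidden + (1 - g) * t
```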
“…CIT [162] employs an extractor (RoBERTa) to extract the important words and sentences from the input, which are then fed into the encoder together with the input. In addition, topic models can capture the global semantics of the document, which can be integrated into the summarization model [134]. Finally, GSum [36] proposes a general framework that feeds different kinds of guidance into the generation model, including keywords, triples, highlighted sentences, and retrieved summaries.…”
Section: Text Summarization (mentioning, confidence: 99%)
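As a much-simplified sketch of guidance-based summarization: GSum itself uses a dedicated guidance encoder with separate cross-attention, but the simplest hedged variant just concatenates the guidance signal with the source before encoding. The checkpoint name and separator below are illustrative assumptions, not GSum's actual setup.

```python
# Hedged, simplified sketch of feeding textual guidance to a seq2seq
# summarizer by concatenation. GSum itself uses a dedicated guidance
# encoder with separate cross-attention; this is only the simplest variant.
from transformers import BartForConditionalGeneration, BartTokenizer

tok = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

def guided_summary(document: str, guidance: str, max_len: int = 128) -> str:
    # Prepend guidance (keywords, highlighted sentences, ...) to the source.
    text = guidance + " </s> " + document
    ids = tok(text, return_tensors="pt", truncation=True).input_ids
    out = model.generate(ids, max_length=max_len, num_beams=4)
    return tok.decode(out[0], skip_special_tokens=True)
```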