Findings of the Association for Computational Linguistics: EMNLP 2021
DOI: 10.18653/v1/2021.findings-emnlp.106
Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization

Abstract: Unlike well-structured text, such as news reports and encyclopedia articles, dialogue content often comes from two or more interlocutors exchanging information with each other. In such a scenario, the topic of a conversation can vary as it progresses, and the key information for a given topic is often scattered across utterances from different speakers, which poses challenges for abstractive dialogue summarization. To capture the various topic information of a conversation and outline salient facts for the…

Citation types: 0 supporting, 17 mentioning, 0 contrasting
Cited by 34 publications (17 citation statements). References 39 publications.
“…Feng et al. [46] utilize DialoGPT [220], a PLM specially designed for dialogue, to automatically extract keywords, detect redundant utterances, and divide a dialogue into topically coherent segments. Similarly, ConDigSum [114] detects dialogue topic shifts and generates a summary for each topic using contrastive learning.…”
Section: Text Summarization (mentioning; confidence: 99%)
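Neither statement spells out the segmentation step at code level. As a rough, hypothetical sketch of the general idea (not Feng et al.'s or ConDigSum's actual procedure): embed each utterance with a dialogue PLM such as DialoGPT, then place a segment boundary wherever adjacent utterances are dissimilar. The function names and the 0.5 threshold below are illustrative assumptions.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two 1-D vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def segment_by_topic(utterance_embs, threshold=0.5):
    """Split a dialogue into topically coherent segments.

    utterance_embs: one vector per utterance, e.g. pooled hidden states
    from a dialogue PLM (a hypothetical choice; any sentence encoder works).
    A boundary is placed wherever adjacent utterances fall below the
    similarity threshold, a crude proxy for a topic shift.
    Returns a list of segments, each a list of utterance indices.
    """
    if len(utterance_embs) == 0:
        return []
    segments, current = [], [0]
    for i in range(1, len(utterance_embs)):
        if cosine(utterance_embs[i - 1], utterance_embs[i]) < threshold:
            segments.append(current)  # low coherence -> start a new topic
            current = []
        current.append(i)
    segments.append(current)
    return segments
```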
“…To address the issue of key points missing from the output text, Nguyen et al. [134] introduced a topic model to capture the global semantics of the document and a mechanism to control the amount of global semantics supplied to the text generation module. Similarly, Liu et al. [114] also proposed two topic-aware contrastive learning objectives to capture the global topic information of a conversation and outline salient facts. These objectives implicitly model the topic changes that occur as a conversation progresses, pushing PLMs to focus on snippets that contain salient information from the same topics.…”
Section: Document Representation Learning (mentioning; confidence: 99%)
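The two objectives the statement refers to are not reproduced here. Below is only a generic supervised-contrastive (InfoNCE-style) sketch of the shared idea: snippet representations carrying the same topic label attract, all others repel. Every name is hypothetical, and the batch-label formulation is the simplest possible stand-in for the paper's actual sampling schemes.

```python
import torch
import torch.nn.functional as F

def topic_contrastive_loss(snippets: torch.Tensor,
                           topic_ids: torch.Tensor,
                           temperature: float = 0.1) -> torch.Tensor:
    """Supervised-contrastive loss over dialogue snippets.

    snippets:  (N, d) snippet representations from the encoder.
    topic_ids: (N,)   integer topic-segment label per snippet.
    Snippets sharing a topic serve as positives for each other;
    every other snippet in the batch serves as a negative.
    """
    z = F.normalize(snippets, dim=1)
    sim = z @ z.t() / temperature                    # (N, N) similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))  # exclude self-pairs
    pos = (topic_ids[:, None] == topic_ids[None, :]) & ~self_mask

    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    # mean log-likelihood of the positives, per anchor that has any
    pos_sum = log_prob.masked_fill(~pos, 0.0).sum(dim=1)
    per_anchor = pos_sum / pos.sum(dim=1).clamp(min=1)
    return -per_anchor[pos.any(dim=1)].mean()
```

For example, `topic_contrastive_loss(torch.randn(8, 64), torch.tensor([0, 0, 1, 1, 2, 2, 3, 3]))` treats each adjacent pair as snippets from the same topic segment.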
“…These works also modify the basic transformer-based models with additional encoders (Chen and Yang, 2020) or attention layers (Lei et al., 2021) to utilize the injected features. Liu et al. (2021a) propose a contrastive learning approach for dialogue summarization with multiple training objectives. They also introduce a number of hyper-parameters for contrastive dataset construction and for balancing those objectives.…”
Section: A Related Work (mentioning; confidence: 99%)
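On balancing, the statement only says such hyper-parameters exist. A common recipe, sketched here with placeholder names and weights rather than anything taken from the paper, is a weighted sum of the main summarization loss and the auxiliary objectives.

```python
def combined_loss(l_summarize, l_coherence, l_subsummary,
                  alpha: float = 1.0, beta: float = 1.0):
    """Weighted sum of the main (cross-entropy) summarization loss and
    two auxiliary contrastive objectives. `alpha` and `beta` stand in
    for the balancing hyper-parameters the statement alludes to; the
    names and default values are placeholders, not the paper's."""
    return l_summarize + alpha * l_coherence + beta * l_subsummary
```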