2019
DOI: 10.48550/arxiv.1910.13114
Preprint

Contrastive Attention Mechanism for Abstractive Sentence Summarization

Cited by 3 publications (2 citation statements)
References 19 publications

“…This method extends the influence of source-text keywords on generating the final summary. Duan et al. [23] added a contrastive attention mechanism to the Transformer. This attention is computed from the Transformer's internal parameters and weights to increase attention to the information in the source-language text that is irrelevant to the target-language reference summary.…”
Section: Auxiliary Generation Methods
confidence: 99%
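
For intuition, here is a minimal PyTorch sketch of the idea this statement describes: a conventional attention over relevant source positions paired with an opponent attention over the remaining, irrelevant ones. The function name and the complement-and-renormalize inversion are illustrative assumptions, not the paper's exact formulation; in training, the model would reward the conventional context and penalize the opponent one.

```python
import torch
import torch.nn.functional as F

def contrastive_attention(query, keys, values):
    # Conventional attention: attend to source positions relevant to the query.
    d = query.size(-1)
    scores = query @ keys.transpose(-2, -1) / d ** 0.5  # (batch, 1, src_len)
    alpha = F.softmax(scores, dim=-1)                   # conventional weights
    # Opponent attention (assumed inversion): renormalize the complement of
    # the conventional distribution so mass shifts to the positions that the
    # conventional attention ignores, i.e. the "irrelevant" information.
    alpha_op = (1.0 - alpha) / (1.0 - alpha).sum(dim=-1, keepdim=True)
    ctx = alpha @ values       # context built from relevant positions
    ctx_op = alpha_op @ values # context built from irrelevant positions
    return ctx, ctx_op

# Toy usage: one decoder query over a 5-token source, dimension 16.
q, k, v = torch.randn(1, 1, 16), torch.randn(1, 5, 16), torch.randn(1, 5, 16)
ctx, ctx_op = contrastive_attention(q, k, v)
```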
“…The main issues in applying CL to language for the same purposes as in image settings are the difficulty of establishing a general strategy for text 'augmentation' and of determining many negative samples for an anchor point (Rethmeier and Augenstein 2021). Despite being less popular than its image counterpart (Jaiswal et al. 2021), CL in NLP has been used to pre-train zero-shot predictions (Rethmeier and Augenstein 2020; Pappas and Henderson 2019) and to improve specific tasks such as language modeling (Logeswaran and Lee 2018; Giorgi et al. 2020), text summarization (Duan et al. 2019), and many others (Rethmeier and Augenstein 2021). In these approaches, the core purpose of contrastive learning is to obtain more accurate representations or embeddings of texts/sentences for task-specific purposes.…”
Section: Contrastive Learning
confidence: 99%
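
To make the anchor/negative terminology above concrete, the following is a generic InfoNCE-style sketch in PyTorch, a common CL objective rather than the specific formulation of any paper cited here: each anchor embedding is pulled toward its one positive and pushed away from many negatives.

```python
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    """Score each anchor against its positive and N negatives; the
    cross-entropy target says the positive (index 0) should win."""
    anchor = F.normalize(anchor, dim=-1)        # (B, D)
    positive = F.normalize(positive, dim=-1)    # (B, D)
    negatives = F.normalize(negatives, dim=-1)  # (B, N, D)
    pos_sim = (anchor * positive).sum(-1, keepdim=True)      # (B, 1)
    neg_sim = torch.einsum("bd,bnd->bn", anchor, negatives)  # (B, N)
    logits = torch.cat([pos_sim, neg_sim], dim=1) / temperature
    labels = torch.zeros(anchor.size(0), dtype=torch.long)
    return F.cross_entropy(logits, labels)

# Toy usage: 4 anchors, 8 negatives each, 32-dimensional embeddings.
loss = info_nce(torch.randn(4, 32), torch.randn(4, 32), torch.randn(4, 8, 32))
```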