Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP 2022)
DOI: 10.18653/v1/2022.emnlp-main.409
Salience Allocation as Guidance for Abstractive Summarization

Cited by 10 publications (2 citation statements) · References 0 publications
“…BART [6] and PEGASUS [15] are encoder-decoder-based pre-trained language models. SEASON [18] is a model that jointly learns extractive and abstractive summarization based on BART. Second, SimCLS [4] and SummaReranker [11] are two-stage models that use encoder-only models as second-stage re-ranking models.…”
Section: Results · mentioning confidence: 99%
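The "two-stage" setup this statement contrasts with SEASON separates generation from selection: a seq2seq model first produces several candidate summaries, and an encoder-only model then scores and re-ranks them. Below is a minimal sketch of that pipeline; the checkpoint names (facebook/bart-large-cnn, roberta-base) and the cosine-similarity scoring rule are illustrative assumptions, not the setup of SimCLS or SummaReranker, which each train a dedicated re-ranker.

```python
# Sketch of generate-then-rerank summarization. Checkpoints and the
# untrained cosine scoring rule are assumptions for illustration.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoModelForSeq2SeqLM, AutoTokenizer

GEN_NAME = "facebook/bart-large-cnn"  # stage 1: candidate generator (assumed)
ENC_NAME = "roberta-base"             # stage 2: encoder-only re-ranker (assumed)

gen_tok = AutoTokenizer.from_pretrained(GEN_NAME)
generator = AutoModelForSeq2SeqLM.from_pretrained(GEN_NAME)
enc_tok = AutoTokenizer.from_pretrained(ENC_NAME)
encoder = AutoModel.from_pretrained(ENC_NAME)

def embed(texts):
    """Mean-pool the encoder's last hidden states into one vector per text."""
    batch = enc_tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state   # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)       # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # (B, H)

def rerank_summarize(document, num_candidates=4):
    # Stage 1: produce several candidates with beam search.
    inputs = gen_tok(document, truncation=True, return_tensors="pt")
    ids = generator.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,
        max_length=80,
    )
    candidates = gen_tok.batch_decode(ids, skip_special_tokens=True)
    # Stage 2: score each candidate against the source and keep the best.
    doc_vec = embed([document])
    cand_vecs = embed(candidates)
    scores = F.cosine_similarity(cand_vecs, doc_vec.expand_as(cand_vecs), dim=-1)
    return candidates[scores.argmax().item()]
```

In SimCLS the second-stage encoder is trained with a contrastive loss so that its scores correlate with ROUGE against the reference; the untrained cosine rule above only stands in for that learned scorer.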
“…In text summarization tasks, various attempts have been made to enable models to learn detailed information contained in a text. SEASON [18] introduced a salience-aware cross-attention module to allow the model to better focus on key sentences in the source document. The model is trained by jointly performing extractive and abstractive summarization.…”
Section: Approaches To Reflecting Detailed Information In Text Summar... · mentioning confidence: 99%
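The salience-aware cross-attention idea can be sketched as an additive bias on the cross-attention logits: source tokens assigned a higher predicted salience level receive a learned offset, so the decoder attends to them more. The module below is a deliberate simplification of that mechanism, not SEASON's exact formulation; the per-token salience levels and the scalar bias embedding are assumptions for illustration.

```python
# Illustrative salience-biased cross-attention: a learned scalar offset per
# discrete salience level is added to the attention logits. This is a
# simplified stand-in for SEASON's module, not its exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SalienceBiasedCrossAttention(nn.Module):
    def __init__(self, d_model: int, num_salience_levels: int = 4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One logit offset per salience level (assumed discretization).
        self.salience_bias = nn.Embedding(num_salience_levels, 1)
        self.scale = d_model ** -0.5

    def forward(self, decoder_states, encoder_states, salience_levels):
        # decoder_states: (B, T_dec, d); encoder_states: (B, T_enc, d)
        # salience_levels: (B, T_enc) integer salience level per source token
        q = self.q(decoder_states)
        k = self.k(encoder_states)
        v = self.v(encoder_states)
        logits = torch.matmul(q, k.transpose(-1, -2)) * self.scale  # (B, T_dec, T_enc)
        bias = self.salience_bias(salience_levels).squeeze(-1)      # (B, T_enc)
        logits = logits + bias.unsqueeze(1)      # broadcast bias over T_dec
        attn = F.softmax(logits, dim=-1)
        return torch.matmul(attn, v)

# Smoke test with random tensors.
layer = SalienceBiasedCrossAttention(d_model=16)
dec = torch.randn(2, 5, 16)
enc = torch.randn(2, 7, 16)
levels = torch.randint(0, 4, (2, 7))
assert layer(dec, enc, levels).shape == (2, 5, 16)
```

In the paper itself the salience levels come from a jointly trained extractive head over source sentences; here they are passed in as an already-computed input to keep the sketch self-contained.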