2022
DOI: 10.48550/arxiv.2203.02797
Preprint
ClueGraphSum: Let Key Clues Guide the Cross-Lingual Abstractive Summarization

Abstract: Cross-Lingual Summarization (CLS) is the task of generating a summary in one language for an article written in a different language. Previous studies on CLS mainly take pipeline approaches or train end-to-end models on translated parallel data. However, the quality of the generated cross-lingual summaries still leaves room for improvement, and model performance has never been evaluated on a hand-written CLS dataset. Therefore, we first propose a clue-guided cross-lingual abstractive summarization method to… Show more

Cited by 2 publications (1 citation statement) | References 33 publications
“…The method extends the influence of model parameters and weights on generating the final summary. Jiang [24] first extracted key clues, such as keywords and named entities, from the source-language text, transformed the source-language text into a text graph using a clue-guided algorithm, and then constructed a graph encoder and a clue encoder to encode the text graph and the key clues, respectively. Both encoders' outputs were passed to the decoder, and the output distribution and the translation distribution were then combined to generate the summary.…”
Section: Auxiliary Generation Methodsmentioning
confidence: 99%
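The final step described in the citation statement, combining the decoder's output distribution with a translation distribution to produce each summary token, can be sketched as follows. This is a minimal illustration under assumptions: the function name, the dictionary representation of the distributions, and the fixed mixing weight are all hypothetical, not the paper's exact formulation.

```python
def combine_distributions(p_output, p_translation, weight=0.7):
    """Mix the decoder output distribution with the translation distribution.

    p_output, p_translation: dicts mapping target-language tokens to
    probabilities. `weight` is the (assumed) share given to the decoder's
    own output distribution; the remainder goes to the translation side.
    """
    vocab = set(p_output) | set(p_translation)
    mixed = {tok: weight * p_output.get(tok, 0.0)
                  + (1.0 - weight) * p_translation.get(tok, 0.0)
             for tok in vocab}
    # Renormalize so the result is a proper probability distribution.
    total = sum(mixed.values())
    return {tok: p / total for tok, p in mixed.items()}


# Toy example: the two distributions disagree, and mixing lets translation
# evidence reinforce or veto the decoder's own preference.
p_out = {"summary": 0.6, "report": 0.4}
p_trans = {"summary": 0.3, "abstract": 0.7}
p_final = combine_distributions(p_out, p_trans)
best = max(p_final, key=p_final.get)  # "summary": favored by both sources
```

In practice such a mixing weight would typically be predicted per decoding step by the model rather than fixed, but the per-token linear interpolation shown here is the core mechanism.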