2023
DOI: 10.1109/access.2023.3277754
Multi-Encoder Transformer for Korean Abstractive Text Summarization

Abstract: In this paper, we propose a Korean abstractive text summarization approach that uses a multi-encoder transformer. Recently, in many natural language processing (NLP) tasks, the use of pre-trained language models (PLMs) for transfer learning has achieved remarkable performance. In particular, transformer-based models such as Bidirectional Encoder Representations from Transformers (BERT) are used for pre-training and applied to downstream tasks, showing state-of-the-art performance on tasks including abstractive text…
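The truncated abstract does not specify how the multiple encoders are fused, so the following is only a minimal illustrative sketch of the general multi-encoder seq2seq idea: two independent Transformer encoders read two views of the input, their outputs are concatenated along the sequence axis, and a single decoder attends to the fused memory. The class name, hyperparameters, and concatenation-based fusion are assumptions for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn

class MultiEncoderTransformer(nn.Module):
    """Toy multi-encoder seq2seq Transformer (illustrative only).

    Fusion by concatenating encoder memories is an assumption; the paper's
    actual fusion mechanism may differ.
    """

    def __init__(self, vocab_size: int, d_model: int = 256, nhead: int = 4,
                 num_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder_a = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.encoder_b = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_a, src_b, tgt):
        mem_a = self.encoder_a(self.embed(src_a))      # (B, La, d)
        mem_b = self.encoder_b(self.embed(src_b))      # (B, Lb, d)
        memory = torch.cat([mem_a, mem_b], dim=1)      # fused memory (B, La+Lb, d)
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        out = self.decoder(self.embed(tgt), memory, tgt_mask=tgt_mask)
        return self.lm_head(out)                       # next-token logits

# Smoke test with random token ids (hypothetical vocabulary of 1000 tokens).
model = MultiEncoderTransformer(vocab_size=1000)
src_a = torch.randint(0, 1000, (2, 20))   # e.g., document tokens
src_b = torch.randint(0, 1000, (2, 20))   # e.g., a second encoder view
tgt = torch.randint(0, 1000, (2, 10))     # shifted summary tokens
logits = model(src_a, src_b, tgt)         # shape: (2, 10, 1000)
```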

Cited by 7 publications
References 29 publications