2022
DOI: 10.1109/tcss.2021.3088506

T-BERTSum: Topic-Aware Text Summarization Based on BERT

Cited by 68 publications (17 citation statements)
References 29 publications
“…First Story Detection: UMASS [10], LSH-UMASS [11], PAR-UMASS [12], K-Term [13], Rel-EFSD [1] | Rel-EFSD [1]
Question Answering: PARALEX [14], SEMPRE [15], ParSEMPRE MemNets [14], BERT [16], T5 [8], ALBERT [17], DistilBERT [18], GPT-3 [19] | BERT [16]
Information Extraction: Ollie [20], ReVerb [21], Open-IE [22], TextRunner [23], RelNoun [24], CALMIE [25] | ReVerb [21]
Summarization: T-BERTSum [26], PEGASUS [27], T5 [8] | T5 [8]
Fake News Detection: TriFN [28], FakeNewsTracker [29], Tree CRF [30] | BERT [16]…”
Section: Mining Task-Relevant Methods (News Monitor) | mentioning
confidence: 99%
“…Farahani [40] presented two approaches to the task of text summarization, using the mT5 model and the ParsBERT model, and obtained good results on pn-summary, a dataset for Persian abstractive text summarization. Ma [41] proposed T-BERTSum, a pre-trained model for text summarization that captures the key words of topic information in social media posts, understands sentence meaning, judges the topic of the discussion, and then generates a high-quality summary. Kerui [42] combines BERT, Seq2seq, and reinforcement learning to form a text summarization model.…”
Section: Related Work | mentioning
confidence: 99%
“…In summary, in our work, social media (such as Twitter), pre-trained models, and text summarization (including extractive and abstractive methods) are the three key elements of event summarization. As the above literature review shows, summarizing tweets about specific events on social media has gradually become one of the most popular summarization research topics in recent years [1-3,41,42]. However, extractive summarization cannot capture many of the key sentences in tweet-form data, resulting in unsatisfactory quality of the generated summaries.…”
Section: Related Work | mentioning
confidence: 99%