2020
DOI: 10.48550/arxiv.2012.14136
Preprint

On Generating Extended Summaries of Long Documents

Cited by 2 publications (3 citation statements) | References 0 publications
“…While work on summarizing novels is sparse, there has been plenty of work on summarizing other kinds of long documents, such as scientific papers (Abu-Jbara and Radev, 2011; Collins et al., 2017; Subramanian et al., 2019; Cohan et al., 2018; Xiao and Carenini, 2019; Zhao et al., 2020; Sotudeh et al., 2020) and patents (Sharma et al., 2019), as well as multi-document summarization (Liu et al., 2018; Ma et al., 2020; Gharebagh et al., 2020; Chandrasekaran et al., 2020; Liu and Lapata, 2019a; Gao et al., 2020). Many of these techniques use a hierarchical approach to generating final summaries, either by having a hierarchical encoder (Cohan et al., 2018; Zhang et al., 2019c; Liu and Lapata, 2019a) or by first running an extractive summarization model followed by an abstractive model (Subramanian et al., 2019; Liu et al., 2018; Zhao et al., 2020; Gharebagh et al., 2020).…”
Section: Related Work (mentioning)
confidence: 99%
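To make the extract-then-abstract pipeline described in the excerpt above concrete, here is a minimal Python sketch. It is an illustration, not code from any of the cited papers: the Jaccard-overlap scorer and the `abstractor` hook are assumptions standing in for trained extractive and abstractive models.

```python
# A minimal sketch of an extract-then-abstract pipeline: an extractive
# stage shortlists salient sentences from a long document, and an
# abstractive model rewrites only that shortlist.

def extract_salient(sentences: list[str], k: int = 5) -> list[str]:
    """Score sentences by mean Jaccard overlap with the rest; keep the
    top-k in original document order."""
    token_sets = [set(s.lower().split()) for s in sentences]

    def centrality(i: int) -> float:
        scores = [len(token_sets[i] & t) / (len(token_sets[i] | t) or 1)
                  for j, t in enumerate(token_sets) if j != i]
        return sum(scores) / max(len(scores), 1)

    top = sorted(range(len(sentences)), key=centrality, reverse=True)[:k]
    return [sentences[i] for i in sorted(top)]

def summarize(document: str, abstractor=None, k: int = 5) -> str:
    """Extract-then-abstract: naive sentence split, extractive shortlist,
    then an optional rewrite via the hypothetical `abstractor` hook."""
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    shortlist = ". ".join(extract_salient(sentences, k))
    return abstractor(shortlist) if abstractor else shortlist
```

In a real system the heuristic scorer would be replaced by a trained extractor and `abstractor` by a sequence-to-sequence model; the point of the two-stage design is that the abstractive model only ever sees the shortlist, sidestepping its input-length limit on long documents.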
“…The dot product of the embeddings of the two terms q_i and d_j may thus be used to measure their similarity. However, many unrelated word pairs also have dot products above zero, even though a positive score is meant to indicate syntactic or semantic relatedness [18]. The noise these spurious similarities contribute is detrimental to the retrieval model's performance.…”
Section: A (mentioning)
confidence: 99%
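The dot-product similarity in this excerpt, and the noise problem it raises, can be shown with a toy sketch; the vocabulary, dimensionality, and random vectors below are assumptions for illustration, not data from [18].

```python
import numpy as np

# Toy stand-ins for trained embeddings: random 50-d vectors.
rng = np.random.default_rng(0)
vocab = ["summary", "abstract", "banana"]
embeddings = {w: rng.standard_normal(50) for w in vocab}

def dot_similarity(q_i: str, d_j: str) -> float:
    """Similarity of query term q_i and document term d_j as the dot
    product of their embedding vectors."""
    return float(np.dot(embeddings[q_i], embeddings[d_j]))

# Even an unrelated pair can score above zero -- the noise the excerpt
# says degrades retrieval performance:
print(dot_similarity("summary", "abstract"))
print(dot_similarity("summary", "banana"))
```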
“…To uncover the relationships between words, word embedding models are trained on a huge corpus. In this approach, the word embeddings are then subjected to a post-processing phase known as "retrofitting" [18].…”
Section: Table 1: Similarities Between Pairs of Words (mentioning)
confidence: 99%
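A minimal sketch of the retrofitting idea, assuming a simple neighbour-averaging update: pre-trained vectors are iteratively pulled toward the vectors of their lexicon neighbours while staying anchored to their original values. The toy lexicon, weights, and function names are illustrative assumptions, not the exact procedure of [18].

```python
import numpy as np

def retrofit(vectors: dict, lexicon: dict, alpha: float = 1.0,
             beta: float = 1.0, iters: int = 10) -> dict:
    """Return retrofitted copies of `vectors`, where `lexicon` maps a
    word to a list of related words it should move toward."""
    new = {w: v.copy() for w, v in vectors.items()}
    for _ in range(iters):
        for w, neighbours in lexicon.items():
            if w not in new:
                continue
            nbrs = [n for n in neighbours if n in new]
            if not nbrs:
                continue
            # Weighted mean of the original vector and neighbour vectors.
            total = alpha * vectors[w] + beta * sum(new[n] for n in nbrs)
            new[w] = total / (alpha + beta * len(nbrs))
    return new

# Usage with toy 3-d vectors and a one-entry lexicon:
vecs = {"car": np.array([1.0, 0.0, 0.0]),
        "auto": np.array([0.0, 1.0, 0.0])}
print(retrofit(vecs, {"car": ["auto"]})["car"])
```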