Proceedings of the First Workshop on Scholarly Document Processing 2020
DOI: 10.18653/v1/2020.sdp-1.1

Overview of the First Workshop on Scholarly Document Processing (SDP)

Cited by 12 publications (11 citation statements); references 3 publications.

“…The First Scholarly Document Processing workshop (Chandrasekaran et al., 2020) comprised three summarization tasks, each of which aimed to improve the state of the art in scientific document summarization. In total, we received 18 submissions that addressed one or more of these tasks.…”
Section: Discussion
confidence: 99%

“…Over time, the Shared Task has spurred the creation of new resources, tools, and evaluation frameworks. As a consequence of this wide interest, CL-SciSumm 2020 is jointly organised with the inaugural editions of two other scientific summarization shared tasks, all of which were held as part of the SDP 2020 workshop at EMNLP (Chandrasekaran et al., 2020). A pilot CL-SciSumm task was conducted at TAC 2014, as part of the larger BioMedSumm Task. In 2016, a second CL-SciSumm Shared Task (Jaidka et al., 2018) was held as part of the Joint Workshop on Bibliometric-enhanced Information Retrieval and Natural Language Processing for Digital Libraries (BIRNDL) at the Joint Conference on Digital Libraries (JCDL 2016).…”
Section: Overview
confidence: 99%

“…Generating summaries with different language styles can benefit readers of varying literacy levels (Chandrasekaran et al., 2020) or interests (Jin et al., 2020). Significant progress has been made in abstractive summarization with large pre-trained Transformers (Dong et al., 2019; Lewis et al., 2020; Zhang et al., 2019; Raffel et al., 2019; Song et al., 2019).…”
Section: Introduction
confidence: 99%

“…Significant progress has been made in abstractive summarization with large pre-trained Transformers (Dong et al., 2019; Lewis et al., 2020; Zhang et al., 2019; Raffel et al., 2019; Song et al., 2019). However, style-controlled summarization is much less studied (Chandrasekaran et al., 2020), and two key challenges have been identified: (1) lack of parallel data, and (2) expensive (re)training, e.g., separate summarizers must be trained or fine-tuned for a pre-defined set of styles (Zhang et al., 2018). Both challenges call for inference-time methods built on trained summarization models, to adjust styles flexibly and efficiently.…”
Section: Introduction
confidence: 99%
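
The statements above describe abstractive summarization with large pre-trained Transformers. As a minimal illustrative sketch, not drawn from any of the cited papers, the snippet below summarizes a short passage with a publicly available BART checkpoint via the Hugging Face transformers pipeline; the checkpoint name, input text, and generation parameters are assumptions chosen for the example.

# Minimal sketch: abstractive summarization with a pre-trained Transformer.
# Assumes the Hugging Face `transformers` library with a PyTorch backend and
# the public BART checkpoint fine-tuned on CNN/DailyMail (an assumption; the
# cited papers train and evaluate their own models and datasets).
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

document = (
    "The First Workshop on Scholarly Document Processing brought together "
    "researchers working on mining, summarizing, and retrieving scientific "
    "literature, and hosted shared tasks on scientific document summarization."
)

# max_length / min_length bound the generated summary in tokens;
# do_sample=False keeps decoding deterministic.
result = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])

Note that this sketch covers only the fixed-style case; the inference-time style control discussed in the quoted statement would require additional decoding-time machinery beyond a single pre-trained summarizer.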