ContriSci: A BERT-Based Multitasking Deep Neural Architecture to Identify Contribution Statements from Research Papers
2021. DOI: 10.1007/978-3-030-91669-5_34

Cited by 4 publications (two from 2022, two from 2024), with 2 citation statements shown below. References 11 publications.
“…Banerjee et al. [5] apply sequential transfer learning from the medical to the computer science domain for discourse classification, but only for two domains and only on abstracts, whereas Spangher et al. [66] explore the task on news articles with multi-task learning across multiple datasets. Gupta et al. [29] utilise multi-task learning with two scaffold tasks to detect contribution sentences in full papers, but only in one domain and with limited sentence context. Several approaches also train multiple tasks jointly: Luan et al. [46] train a model on three tasks (coreference resolution, entity extraction, and relation extraction) using one dataset of research papers.…”
Section: Transfer Learning (citation type: mentioning)
Confidence: 99%
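The scaffold set-up this excerpt attributes to Gupta et al. [29] can be pictured concretely. The following is a minimal sketch, not the authors' implementation: one shared BERT encoder feeds a main head for contribution-sentence detection plus two auxiliary scaffold heads trained jointly. The class name, the particular scaffold tasks (section prediction and citation worthiness), the label counts, and the loss weighting are all illustrative assumptions.

```python
# Minimal sketch (an assumption, not Gupta et al.'s code) of multi-task
# learning with scaffold tasks: a shared BERT encoder, a main head for
# contribution-sentence detection, and two auxiliary scaffold heads, all
# trained jointly on the same [CLS] representation.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class ScaffoldMultiTaskModel(nn.Module):  # hypothetical name
    def __init__(self, encoder_name="bert-base-uncased",
                 n_sections=5, n_citance_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Main task: is the sentence a contribution statement?
        self.contrib_head = nn.Linear(hidden, 2)
        # Scaffold tasks (illustrative choices): which section a sentence
        # appears in, and whether the sentence cites prior work.
        self.section_head = nn.Linear(hidden, n_sections)
        self.citance_head = nn.Linear(hidden, n_citance_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] vector per sentence
        return (self.contrib_head(cls),
                self.section_head(cls),
                self.citance_head(cls))

def joint_loss(logits, labels, scaffold_weight=0.1):
    """Main-task loss plus down-weighted scaffold losses (weight assumed)."""
    ce = nn.CrossEntropyLoss()
    (main, sec, cit), (y_main, y_sec, y_cit) = logits, labels
    return ce(main, y_main) + scaffold_weight * (ce(sec, y_sec) + ce(cit, y_cit))

# Usage: tokenise a sentence and score all three heads in one pass.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tok(["We propose a new architecture for X."], return_tensors="pt")
contrib_logits, _, _ = ScaffoldMultiTaskModel()(batch["input_ids"],
                                                batch["attention_mask"])
```

The point of the scaffold heads is that they share the encoder's parameters with the main task, so supervision from cheap auxiliary labels regularises the representation used for the harder contribution-detection task.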
“…First, although some approaches propose transfer learning for the scientific domain [5,10,29,53], the field lacks a comprehensive empirical study of transfer learning across different scientific domains for sequential sentence classification. Transfer learning allows knowledge from multiple datasets to be combined, improving classification performance and thereby reducing annotation costs.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
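As a companion to the multi-task sketch above, here is a minimal, hedged illustration of the sequential transfer-learning recipe the first excerpt attributes to Banerjee et al. [5]: fine-tune a sentence classifier on a source domain, then continue training the same weights on the target domain. The loader names (medical_loader, cs_loader), epoch count, and learning rate are hypothetical placeholders, not details taken from the cited works.

```python
# Minimal sketch (an assumption, not from the cited papers) of sequential
# transfer learning: fine-tune a classifier on a source domain, then keep
# training the *same* weights on the target domain.
import torch
from transformers import AutoModelForSequenceClassification

def fine_tune(model, loader, epochs=3, lr=2e-5, device="cpu"):
    """One fine-tuning stage; call once per domain, reusing the weights."""
    model.to(device).train()
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    for _ in range(epochs):
        for batch in loader:  # dict with input_ids, attention_mask, labels
            opt.zero_grad()
            loss = model(**{k: v.to(device) for k, v in batch.items()}).loss
            loss.backward()
            opt.step()
    return model

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)
# Stage 1: source domain (e.g. medical abstracts); stage 2: the same
# weights on the target domain (computer science). Loaders hypothetical.
# model = fine_tune(model, medical_loader)
# model = fine_tune(model, cs_loader)
```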