Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/P19-1191
PaperRobot: Incremental Draft Generation of Scientific Ideas

Abstract: We present a PaperRobot who performs as an automatic research assistant by (1) conducting deep understanding of a large collection of human-written papers in a target domain and constructing comprehensive background knowledge graphs (KGs); (2) creating new ideas by predicting links from the background KGs, by combining graph attention and contextual text attention; (3) incrementally writing some key elements of a new paper based on memory-attention networks: from the input title along with predicted related en…
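The abstract's idea-creation step scores candidate links in the background KG by combining graph attention with contextual text attention. A minimal sketch of that blending idea is below; the function names, the dot-product attention, and the additive mixing weight `gamma` are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def score_links(query, neighbor_embs, context_embs, gamma=0.5):
    """Toy link scorer blending two attention signals.

    query         : (d,) embedding of the head entity
    neighbor_embs : (n, d) embeddings of candidate KG neighbors (graph side)
    context_embs  : (m, d) embeddings of context text spans (text side)
    gamma         : mixing weight between graph and text representations
    """
    graph_att = softmax(neighbor_embs @ query)   # attention over KG neighbors
    text_att = softmax(context_embs @ query)     # attention over text context
    graph_repr = graph_att @ neighbor_embs       # attended graph representation
    text_repr = text_att @ context_embs          # attended text representation
    blended = gamma * graph_repr + (1 - gamma) * text_repr
    # Score each candidate link by similarity to the blended representation.
    return neighbor_embs @ blended
```

In this sketch, a high score for a neighbor means it is supported both by the graph structure and by the textual context; the real model learns these attentions jointly rather than using fixed dot products.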

Cited by 31 publications (24 citation statements)
References 51 publications
“…Describing their system called PaperRobot, built prior to the pandemic outbreak, Wang et al describe that "creating new nodes usually means discovering new entities (e.g. new proteins) through a series of laboratory experiments, which is probably too difficult for PaperRobot" [55].…”
Section: Knowledge Graph Construction
confidence: 99%
“…Text generation in the scientific domain has achieved progress in several ways. Wang et al. (2019) generate the paper abstract from the input title along with predicted entities in the related papers, and further generate the paragraphs for the conclusion and future work. Demir et al. (2019) generate the LaTeX source code with a sequence-to-sequence model in a straightforward manner.…”
Section: Text Generation in Scientific Domain
confidence: 99%
“…To enable the model to generate plausible paragraphs, we introduce context-aware paragraph generation, a task that aims to generate paragraphs for the "Introduction" section. Unlike previous works, which generate paragraphs using limited information (e.g., the abstract) (Wang et al., 2019; Demir et al., 2019), we provide the model with substantial contextual information C, namely the body texts of the cited papers. For simplicity, we only use the cited papers involved in the "Introduction" section and ignore the objects in them.…”
Section: Context-aware Paragraph Generation
confidence: 99%
“…With drug repurposing as a case study, we focus on 11 typical questions that human experts pose and integrate our techniques to generate a comprehensive report for each candidate drug. Our coarse-grained Information Extraction (IE) system consists of three components: (1) coarse-grained entity extraction (Wang et al., 2019a) and entity linking (Zheng et al., 2015) for four entity types: Gene, Disease, Chemical, and Organism nodes. We follow the entity ontology defined in the Comparative Toxicogenomics Database (CTD) (Davis et al., 2016), and obtain a Medical Subject Headings (MeSH) Unique ID for each mention.…”
Section: Introduction
confidence: 99%