Proceedings of the 12th International Conference on Natural Language Generation 2019
DOI: 10.18653/v1/w19-8624

BERT for Question Generation

Abstract: In this study, we investigate the use of the pre-trained BERT language model to tackle question generation tasks. We introduce two neural architectures built on top of BERT for question generation. The first is a straightforward application of BERT, which reveals the shortcomings of using BERT directly for text generation. The second remedies these by restructuring the BERT model into a sequential decoder that draws on information from previously decoded results. Our models are trained …
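The abstract describes a sequential reformulation of BERT that conditions each decoding step on previously generated tokens. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it greedily fills a trailing [MASK] token with Hugging Face's bert-base-uncased masked-LM head, re-encoding the context and the partial question at every step. The model name, stopping criterion, and decoding loop are illustrative assumptions.

# Minimal sketch (assumed setup, not the paper's exact model): generate a question
# one token at a time by repeatedly asking BERT's masked-LM head to fill a [MASK]
# appended after the tokens decoded so far.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def generate_question(context: str, max_len: int = 20) -> str:
    question_ids = []
    for _ in range(max_len):
        # Re-encode "context [SEP] question-so-far [MASK]" at every step so the
        # model can attend to everything decoded previously.
        question_so_far = tokenizer.decode(question_ids) if question_ids else ""
        inputs = tokenizer(context, question_so_far + " [MASK]",
                           return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits
        # Index of the trailing [MASK] slot we want BERT to fill.
        mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[-1].item()
        next_id = int(logits[0, mask_pos].argmax())
        if next_id == tokenizer.sep_token_id:  # treat [SEP] as end-of-question
            break
        question_ids.append(next_id)
    return tokenizer.decode(question_ids)

print(generate_question("BERT is a pre-trained language model released in 2018."))

As the citing works note, re-running the full encoder once per output token makes this style of decoding slow compared with a single-pass generative model.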

Cited by 16 publications (9 citation statements)
References 5 publications
“…Recently, applying large-scale pretrained models in QG attracts more and more researchers' interests. Chan et al build a recurrent BERT to output one question word at a recurrent step [11,12], but it is time-consuming. The generative pretrained models such as UNILM [18], T5 [57], PEGASUS [82], and UNILMV2 [5] report the model's QG scores finetuned on SQuAD [58] dataset, but they do not explore the idea of building a unified QG.…”
Section: Related Work, 2.1 Question Generation (mentioning)
Confidence: 99%
“…These massively overparameterized neural networks have revolutionized many different NLP tasks. Effective application of BERT in NMT has been studied in a number of contemporary research projects; Language Modeling, Named Entity Recognition, Question Answering, Natural Language Inference, Text Classification (Devlin et al, 2019), and Question Generation (Chan and Fan, 2019). We approach this problem from the novel perspective of extracting linguistic information encoded in BERT and applying such information in NMT.…”
Section: NMT and BERT (mentioning)
Confidence: 99%