The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313737

Learning to Generate Questions by Learning What not to Generate

Abstract: Automatic question generation is an important technique that can improve the training of question answering, help chatbots to start or continue a conversation with humans, and provide assessment materials for educational purposes. Existing neural question generation models are not sufficient mainly due to their inability to properly model the process of how each word in the question is selected, i.e., whether repeating the given passage or being generated from a vocabulary. In this paper, we propose our Clue G…

Cited by 86 publications (65 citation statements)
References 47 publications

“…The copy probability of each input word is given by the attention weights in Equation (10). It has been reported that the generated words in a target question usually come from frequent words, while the majority of low-frequency words in the long tail are copied from the input rather than generated [27]. Therefore, we reduce the vocabulary to the top N_V high-frequency words at both the encoder and the decoder, where N_V is a predefined threshold that varies across datasets.…”
Section: 2.1
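To ground the copy mechanism this excerpt describes: the decoder's attention weights over the source double as per-word copy probabilities, and generation is restricted to the top N_V frequent words, so rare source words can only enter the question by being copied. Below is a minimal NumPy sketch of one decoding step; the names (mixture_distribution, copy_gate) and the scalar copy gate are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def mixture_distribution(vocab_logits, attn_weights, src_ids, copy_gate):
    """One decoding step of a generate-or-copy output layer (sketch).

    vocab_logits: scores over the top-N_V high-frequency words only
    attn_weights: attention over source positions (sums to 1); these
                  double as per-position copy probabilities
    src_ids:      vocabulary id of each source token; rare words get
                  ids >= N_V, outside the truncated vocabulary
    copy_gate:    assumed scalar in [0, 1] mixing copy vs. generate
    """
    n_v = len(vocab_logits)
    ext_size = max(n_v, max(src_ids) + 1)  # extended vocab incl. rare words
    p = np.zeros(ext_size)
    p[:n_v] = (1.0 - copy_gate) * softmax(vocab_logits)  # generation part
    for pos, tok in enumerate(src_ids):                  # copy part
        p[tok] += copy_gate * attn_weights[pos]
    return p

# Toy step: 5 frequent words; the source contains a rare word with id 7,
# which can only receive probability mass through copying.
p = mixture_distribution(
    vocab_logits=np.array([2.0, 1.0, 0.5, 0.1, 0.0]),
    attn_weights=np.array([0.7, 0.2, 0.1]),
    src_ids=[7, 1, 3],
    copy_gate=0.4,
)
print(round(p.sum(), 6))  # 1.0 — a valid distribution over the extended vocab
```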
“…• NQG-Knowledge [16], DLPH [12]: auxiliary-information-enhanced question generation models with extra inputs such as knowledge or difficulty.
• Self-training-EE [38], BERT-QG-QAP [51], NQG-LM [55], CGC-QG [27] and QType-Predict [56]: multi-task question generation models with auxiliary tasks such as question answering, language modeling, question type prediction and so on.…”
Section: Evaluating ACS-aware Question Generation
“…Automatic question generation has attracted increasing attention from the natural language generation community in recent years, as reflected in newly published datasets (Zhou et al., 2017; Chen et al., 2018) and sophisticated techniques (Du et al., 2017; Liu et al., 2019). Traditional methods are mainly rule-based: they first transform the source information into a syntactic representation and then use templates to generate related questions (Heilman, 2011).…”
Section: Related Work
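As a toy illustration of the rule-based pipeline this excerpt describes (parse first, then fill templates), the sketch below maps an assumed (subject, verb, object) analysis to wh-questions. The template strings and the pre-inflected verb forms are assumptions for illustration; real systems such as Heilman (2011) operate over full syntactic parses with far richer rule sets.

```python
def generate_questions(subject, verb_base, verb_past, object_):
    """Fill hand-written wh-question templates from a parsed triple.

    The syntactic analysis is assumed to be done upstream; verb
    inflection is supplied directly rather than computed.
    """
    return [
        f"What did {subject} {verb_base}?",  # question about the object
        f"Who {verb_past} {object_}?",       # question about the subject
    ]

print(generate_questions("Marie Curie", "discover", "discovered", "radium"))
# ['What did Marie Curie discover?', 'Who discovered radium?']
```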
“…As an extension of the convolutional network, the graph convolutional network, introduced by Bruna et al. [6], exploits the adjacency matrix or the Laplacian matrix that characterizes the graph structure and hence captures the correlations among different nodes. Owing to its power in propagating correlations between nodes, the graph convolutional network has attracted increasing research attention in recent years from both the natural language processing domain [3, 36, 42, 49] and the computer vision domain [9, 20, 25, 26]. For example, Kipf and Welling [36] introduced a graph-based semi-supervised learning framework for node classification, where label information is smoothed over the graph via a Laplacian regularization term in the loss function.…”
Section: Graph Convolutional Network
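To make the propagation rule concrete: a single GCN layer in Kipf and Welling's formulation computes H' = σ(D̃^{-1/2} (A + I) D̃^{-1/2} H W), where A + I adds self-loops and D̃ is its degree matrix. Below is a minimal NumPy sketch; the function name gcn_layer, the ReLU nonlinearity, and the random weights standing in for learned parameters are illustrative assumptions, not code from any cited work.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    adj:      (n, n) binary adjacency matrix of the graph
    features: (n, d_in) node feature matrix H
    weight:   (d_in, d_out) projection W (learned in practice; random here)
    """
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))  # degree normalization
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm @ features @ weight, 0.0)  # ReLU

# Tiny 3-node path graph, 4-dim input features, 2-dim output features.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
h = gcn_layer(adj, rng.normal(size=(3, 4)), rng.normal(size=(4, 2)))
print(h.shape)  # (3, 2) — each node now mixes its neighbors' features
```

The normalized adjacency keeps feature magnitudes stable as layers stack, which is why the symmetric D^{-1/2} scaling is used rather than the raw adjacency matrix.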