Question generation is a promising and important area in natural language processing. Given an input text, a question generation model automatically produces a variety of questions, which can support many downstream tasks. Chinese question generation is a specific sub-area of this field. Owing to the characteristics of the Chinese language, many existing methods are ill-suited to it: the generated questions often have incorrect word order or invalid expressions. To address this challenging problem, we propose a conditional pre-trained attention model, termed A Lite BERT Conditional Question Generation (ALBERT-CQG), for Chinese question generation. By introducing the general background knowledge of a pre-trained model and conditional information from the given answers, the model is able to generate more valid expressions. To our knowledge, we are the first to apply a conditional pre-trained attention model to Chinese question generation. Experimental results on two well-known Chinese question answering benchmark datasets show that ALBERT-CQG outperforms recent peer methods.
KEYWORDS: Chinese question generation, conditional pre-trained attention model, general background knowledge
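The abstract describes conditioning a pre-trained ALBERT encoder on the given answer. The sketch below is a minimal illustration of that idea, not the authors' implementation: it packs the passage and the answer as two input segments so the encoder's self-attention can tie passage tokens to the answer condition. The `albert-base-v2` checkpoint is an assumed stand-in (a Chinese ALBERT checkpoint would be used in practice), and the question decoder is omitted.

```python
# Minimal sketch of answer-conditioned encoding for question generation.
# Assumption: "albert-base-v2" stands in for the Chinese ALBERT checkpoint
# the paper would actually use; this is not the authors' released code.
import torch
from transformers import AlbertModel, AlbertTokenizerFast

tokenizer = AlbertTokenizerFast.from_pretrained("albert-base-v2")
encoder = AlbertModel.from_pretrained("albert-base-v2")

passage = "ALBERT shares parameters across layers to stay lightweight."
answer = "parameter sharing"

# Condition the encoder on the answer by packing it as a second segment:
# [CLS] passage [SEP] answer [SEP]. The token_type_ids distinguish the two
# segments, so attention can relate passage tokens to the given answer.
inputs = tokenizer(passage, answer, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)

# A question decoder (omitted here) would attend over `hidden`, which
# already mixes passage context with the answer condition, and generate
# the question token by token.
print(hidden.shape)
```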