2019
DOI: 10.31142/ijtsrd23730

Dynamic Question Answer Generator: An Enhanced Approach to Question Generation

Abstract: Teachers and educational institutions seek new questions of varying difficulty levels when setting tests for their students. Students, too, want distinct, fresh questions to practice with, since redundant questions are found everywhere. However, setting new questions every time is a tedious task for teachers. To overcome this problem, we have built an artificially intelligent system that generates questions and answers for the mathematical topic of quadratic equations. The system uses…
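The abstract is truncated above, so the system's exact technique is not recoverable from this page (the survey table further below attributes the paper's approach to Constraint Handling Rules). Purely as an illustrative sketch of dynamic quadratic question generation, and not the authors' method, one can sample constrained parameters and derive the answer from them; every name in the following Python snippet is hypothetical:

```python
import random

def generate_quadratic_question(max_root: int = 10):
    """Sample integer roots, expand (x - r1)(x - r2) = 0 into
    x^2 + bx + c = 0, and return the question with its answer.
    Illustrative only; not the method from the cited paper."""
    r1 = random.randint(-max_root, max_root)
    r2 = random.randint(-max_root, max_root)
    b = -(r1 + r2)   # coefficient of x from root expansion
    c = r1 * r2      # constant term from root expansion
    question = f"Solve for x: x^2 + ({b})x + ({c}) = 0"
    if r1 == r2:
        answer = f"x = {r1} (double root)"
    else:
        answer = f"x = {r1} or x = {r2}"
    return question, answer

if __name__ == "__main__":
    q, a = generate_quadratic_question()
    print(q)
    print(a)
```

Tightening or loosening the sampling constraints (root range, integer versus rational roots) is one simple way to realize the different difficulty levels the abstract mentions.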

Cited by 3 publications (4 citation statements). References 5 publications.
“…Popular research directions in NLG include text summarization, text expansion, text rewriting, question answering, and dialogue systems. Text abbreviation involves compressing long texts into shorter ones, typically including tasks such as text summarization [5], question generation [6], and distractor generation [7]. Text expansion tasks generate complete sentences or texts from meaningful words, such as short text expansion [8] and topic writing [9].…”
Section: Natural Language Generation
confidence: 99%
“…A further classification of text-to-text generation divides NLG tasks into three categories, i.e., text abbreviation, text expansion, and text rewriting and reasoning. The text abbreviation task is formulated to condense information from long texts to short ones, typically including research on text summarization [6,7,15,17,43,80,99], question generation [4,18,34,36,53,95,104,112,113,130,134], and distractor generation [22,50,60,73,82,86,100,101]. The text expansion tasks, such as short text expansion [5,89,96,106] and topic-to-essay generation [19,81,114,123,129], generate complete sentences or even texts from some meaningful words by considering and adding elements like conjunctions and prepositions to transform the input words into linguistically correct outputs.…”
Section: What Is Natural Language Generation?
confidence: 99%
“…Task: Text Summarization
MASS [99]: PLM + Seq2Seq
BART [43]: PLM + Seq2Seq
RCT [6]: PLM + RNN + CNN + Seq2Seq + Long-term Dependency
ProphetNet [80]: PLM + Seq2Seq + Long-term Dependency
En-semantic-model [15]: RNN + Seq2Seq + Long-term Dependency
Post-editing Factual Error Corrector [7]: PLM + Seq2Seq + Factual Consistency
SpanFact [17]: PLM + Seq2Seq + Factual Consistency

Task: Question Generation
Key-phrase-based Question Generator [18]: Keyphrase + Template
Dynamic Mathematical Question Generator [4]: Constraint Handling Rules
KB-based Factoid Question Generator [95]: RNN + Seq2Seq
Teacher Forcing and RL Based Question Generator [130]: RNN + Seq2Seq + RL
Paragraph-level Question Generator [134]: RNN + Seq2Seq
Answer-focused and Position-aware Question Generator [104]: RNN + Seq2Seq + Answer-focused
ASs2s [36]: RNN + Seq2Seq + Answer-focused
NQG-MP [113]: RNN + Seq2Seq + Multi-task
Paraphrase Enhanced Question Generator [34]: RNN + Seq2Seq + Multi-task
CGC-QG [53]: RNN + Seq2Seq + Multi-task + GNN
PathQG [112]: RNN + Seq2Seq + Multi-task + KG…”
Section: Task Model Description
confidence: 99%
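Most entries in the table above share the "PLM + Seq2Seq" pattern: a pretrained encoder-decoder maps an input sequence to an output sequence. Below is a minimal sketch of that pattern for the summarization row, using a public BART checkpoint via the Hugging Face transformers library; the checkpoint name and example text are my own assumptions, not taken from this page:

```python
# Sketch of the "PLM + Seq2Seq" pattern (summarization with BART).
# Requires: pip install transformers torch
from transformers import pipeline

# "facebook/bart-large-cnn" is a widely used public checkpoint,
# chosen here for illustration only.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

text = (
    "Teachers and educational institutions seek new questions with "
    "different difficulty levels for setting up tests. Setting new "
    "questions every time is a tedious task, so automatic question "
    "generation systems have been proposed to reduce this effort."
)

# The pretrained encoder-decoder maps the input text to a shorter
# output sequence; the result is a list of dicts with "summary_text".
summary = summarizer(text, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

The question-generation rows in the table follow the same encoder-decoder pattern, typically fine-tuned on (passage, question) pairs rather than (document, summary) pairs.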