Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1605

Can You Unpack That? Learning to Rewrite Questions-in-Context

Abstract: Question answering is an AI-complete problem, but existing datasets lack key elements of language understanding such as coreference and ellipsis resolution. We consider sequential question answering: multiple questions are asked one-by-one in a conversation between a questioner and an answerer. Answering these questions is only possible through understanding the conversation history. We introduce the task of question-in-context rewriting: given the context of a conversation's history, rewrite a context-dependent question…
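The task the abstract defines can be illustrated with a small sketch. The conversation, the context-dependent question, and the target rewrite below are invented for illustration (they are not drawn from the paper's data), and the heuristic check is only a crude stand-in for what a trained rewriting model would learn:

```python
# Hypothetical example of question-in-context rewriting: a question that
# relies on conversation history is rewritten into a self-contained one
# with the same answer.

history = [
    ("Where was Marie Curie born?", "Warsaw, Poland."),
    ("When did she move to Paris?", "In 1891."),
]

# Context-dependent: "she" (coreference) and "there" can only be
# resolved from the history above.
question_in_context = "What did she study there?"

# Target output: a self-contained question with the same answer.
rewritten = "What did Marie Curie study in Paris?"

def is_self_contained(question: str) -> bool:
    """Crude heuristic: a self-contained rewrite should avoid bare
    pronouns and deictic words that require conversational context."""
    context_markers = {"she", "he", "it", "they", "there", "that"}
    tokens = {t.strip("?.,!").lower() for t in question.split()}
    return not (tokens & context_markers)

print(is_self_contained(question_in_context))  # False
print(is_self_contained(rewritten))            # True
```

A real system must also handle ellipsis (e.g. "What about chemistry?"), where the missing material is not a single pronoun, which is why the task is framed as full rewriting rather than pronoun substitution.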

Cited by 134 publications (198 citation statements) · References 20 publications
“…Question Rewriting is a novel, well-defined task which we propose for differentiating distinct answers. To the best of our knowledge, it has not been studied for resolving ambiguity; we are only aware of Elgohary et al. (2019), who use question rewriting to convert conversational questions into self-contained questions.…”
Section: Related Work
confidence: 99%
“…2019a; Pan et al., 2019; Elgohary et al., 2019). IUR aims to rewrite an incomplete utterance into an utterance that is semantically equivalent but self-contained, so that it can be understood without context.…”
Section: Turn
confidence: 99%
“…Recently, it has attracted considerable attention in several domains. In question answering, previous works include non-sentential utterance resolution using a sequence-to-sequence architecture (Kumar and Joshi, 2016), incomplete follow-up question resolution via a retrieval sequence-to-sequence model (Kumar and Joshi, 2017), and sequence-to-sequence models with a copy mechanism (Elgohary et al., 2019; Quan et al., 2019). In conversational semantic parsing, Liu et al. (2019b) proposed a novel approach which considers the structures of questions, while ….…”
Section: Related Work
confidence: 99%
“…For instance, Kotov and Zhai (2010) and Sajjad et al. (2012) use question templates to generate a list of clarification questions. Elgohary et al. (2019) rewrite questions using the dialogue context. … invoke graph edit distance for query refinement.…”
Section: Related Work
confidence: 99%