Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.226

Context Dependent Semantic Parsing: A Survey

Abstract: Semantic parsing is the task of translating natural language utterances into machine-readable meaning representations. Currently, most semantic parsing methods are not able to utilize contextual information (e.g. dialogue and comment history), which has great potential to boost semantic parsing performance. To address this issue, context dependent semantic parsing has recently drawn a lot of attention. In this survey, we investigate progress on methods for context dependent semantic parsing, together…
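To make the role of context concrete, here is a minimal toy sketch (not from the survey itself) in which a follow-up utterance can only be mapped to a SQL-style meaning representation by consulting the dialogue history. The grammar, table schema, and utterance patterns are all hypothetical, chosen only to illustrate the idea.

```python
# Toy sketch of context-dependent semantic parsing: an elliptical
# follow-up utterance is interpreted against the dialogue history.
# All names and rules here are hypothetical illustrations.

def parse(utterance: str, history: list[str]) -> str:
    """Map an utterance (plus dialogue history) to a SQL-style meaning form."""
    if utterance.lower().startswith("show sales for "):
        year = utterance.rsplit(" ", 1)[-1]
        return f"SELECT total FROM sales WHERE year = {year}"
    if utterance.lower().startswith("what about ") and history:
        # Resolve the follow-up by re-parsing the previous utterance
        # and swapping in the newly mentioned year.
        year = utterance.rstrip("?").rsplit(" ", 1)[-1]
        prev = parse(history[-1], history[:-1])
        return prev.rsplit("=", 1)[0] + f"= {year}"
    raise ValueError("utterance not covered by this toy grammar")

history = ["Show sales for 2020"]
print(parse(history[0], []))               # SELECT total FROM sales WHERE year = 2020
print(parse("What about 2019?", history))  # SELECT total FROM sales WHERE year = 2019
```

Without the history, "What about 2019?" is unparseable; with it, the parser recovers the full meaning, which is the core problem the surveyed methods address.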

Cited by 13 publications (7 citation statements). References 42 publications.
“…The user questions expressed in natural language are converted into SQL queries, which are then executed on our Maritime DeepDive Knowledge Graph [34] to retrieve answers to the questions. The recent surveys [13], [16], [43] provide comprehensive reviews of recent semantic parsing studies. Of particular relevance to this work are BootStrapping Semantic Parsers.…”
Section: Background (A. Semantic Parsing)
confidence: 99%
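The snippet above describes the standard text-to-SQL loop: parse a natural-language question into SQL, then execute it against a knowledge base. Below is a minimal sketch of that loop in Python; the in-memory SQLite table merely stands in for the Maritime DeepDive Knowledge Graph, and both the schema and the parse_to_sql stub are hypothetical, not the cited system.

```python
import sqlite3

# Stand-in for the knowledge graph: a small SQLite table. The real
# system queries the Maritime DeepDive Knowledge Graph; this schema
# and the parser stub below are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vessels (name TEXT, flag TEXT)")
conn.executemany("INSERT INTO vessels VALUES (?, ?)",
                 [("Aurora", "Panama"), ("Meridian", "Liberia")])

def parse_to_sql(question: str) -> str:
    """Stub semantic parser: maps one known question pattern to SQL."""
    if question == "Which vessels fly the Panama flag?":
        return "SELECT name FROM vessels WHERE flag = 'Panama'"
    raise ValueError("question outside the stub's coverage")

sql = parse_to_sql("Which vessels fly the Panama flag?")
answers = [row[0] for row in conn.execute(sql)]
print(answers)  # ['Aurora']
```

In a real system the stub would be replaced by a learned parser; the execute-and-answer half of the loop stays essentially the same.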
“…Specifically, by designing and constructing an appropriate input sequence of words (called a prompt), one is able to induce the model to produce the desired output sequence (i.e., a paraphrased utterance in this context) without changing its parameters through fine-tuning at all. In particular, in the low-data regime, empirical analysis shows that, whether prompts are manually hand-crafted [21] or automatically generated [8], [16], prompting is surprisingly effective for knowledge stimulation and adaptation of pre-trained language models. However, none of these methods reports the use of a few-shot pre-trained language model to directly generate few-shot and zero-shot paraphrased utterances.…”
Section: Paraphrase Generation
confidence: 99%
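The quoted passage describes steering a frozen pre-trained language model with a prompt instead of fine-tuning its parameters. A minimal sketch using the Hugging Face transformers pipeline follows; the choice of gpt2 and the prompt wording are assumptions made for illustration, not the cited method.

```python
from transformers import pipeline

# Few-shot prompting sketch: a frozen pre-trained LM (gpt2 here, an
# assumption; the cited work does not prescribe a model) is steered
# toward paraphrasing purely by the prompt, with no fine-tuning.
generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Paraphrase each sentence.\n"
    "Input: Book a flight to Boston.\n"
    "Output: Reserve a plane ticket to Boston.\n"
    "Input: Show me all flights from Denver.\n"
    "Output:"
)

result = generator(prompt, max_new_tokens=20, num_return_sequences=1)
# The pipeline returns the prompt plus the continuation; keep only
# the model's continuation, which is its paraphrase attempt.
completion = result[0]["generated_text"][len(prompt):]
print(completion.strip())
```

A small base model like gpt2 will paraphrase unreliably; the point is the mechanism: the prompt alone, not any weight update, induces the behavior.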
“…Dataset. XDTS is a sub-task of context-dependent semantic parsing (CDSP) (Suhr et al., 2018; Guo et al., 2019a; Li et al., 2020). Many datasets have been constructed for CDSP.…”
Section: Related Work
confidence: 99%
“…Context, in this scenario, is broadly understood as information extracted from the set of words occurring around a given word token (Smith, 2020; Xia et al., 2020). The term context is also used in NLP to refer to information that can be extracted from sentences around the one being analyzed for a given task, such as Natural Language Generation (NLG) for question answering and dialogue systems (Zhou et al., 2016) or context-aware semantic parsing (Li et al., 2020b). Also, identifying and modeling commonsense knowledge is a key aspect of tasks involving Natural Language Understanding (NLU) and Inference (NLI) (LoBue and Yates, 2011; Sap et al., 2020).…”
Section: Introduction
confidence: 99%