Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 2020
DOI: 10.1145/3340531.3411974

Schema2QA: High-Quality and Low-Cost Q&A Agents for the Structured Web

Cited by 14 publications (22 citation statements). References 17 publications.

“…Empirical results on benchmark datasets demonstrate the superiority of our approach in both language modeling and auto-regressive generation. While the proposed method focuses on fundamental text modeling, it is also promising to extend it to more applications such as machine translation [36]-[40], question answering [41], log generation [42], [43], and code generation [44], [45].…”
Section: Discussion (mentioning, confidence: 99%)
“…This paper presents SPL, a toolkit and methodology to extend and localize semantic parsers to a new language with higher accuracy, yet at a fraction of the cost compared to previous methods. SPL was incorporated into the Schema2QA toolkit (Xu et al., 2020a) to give it a multilingual capability.…”
Section: Discussion (mentioning, confidence: 99%)
“…Only a couple of hundred sentences need to be translated manually; no manual annotation of sentences is necessary. • An improved neural semantic parsing model, based on BERT-LSTM (Xu et al., 2020a) but using the XLM-R encoder. Its applicability extends beyond the multilingual semantic parsing task, as it can be deployed for any NLP task that can be framed as sequence-to-sequence.…”
Section: Cross-attention Weights (mentioning, confidence: 99%)
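
The statement above describes swapping the multilingual XLM-R encoder into the BERT-LSTM architecture. As a minimal sketch of what that swap looks like (assuming the Hugging Face transformers API; the checkpoint name "xlm-roberta-base", the example utterance, and all variable names are illustrative, not taken from SPL or Schema2QA):

import torch
from transformers import AutoModel, AutoTokenizer

# Load a pretrained multilingual encoder; one checkpoint covers ~100 languages.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")

# Encode a natural-language utterance into contextual hidden states.
utterance = "show me restaurants rated at least 4 stars"
inputs = tokenizer(utterance, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state  # shape: (1, seq_len, 768)

# `hidden` would then feed an LSTM decoder that emits the target logical
# form token by token (see the pointer-generator sketch further below).

Because the encoder is the only language-dependent component, replacing BERT with XLM-R leaves the rest of the sequence-to-sequence parser unchanged, which is what makes the model reusable for other seq2seq NLP tasks.
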
“…To translate natural language instructions into ThingTalk, we use the previously proposed BERT-LSTM model (Xu et al., 2020). BERT-LSTM is an encoder-decoder network that uses a pretrained BERT encoder (Devlin et al., 2019) and LSTM (Hochreiter and Schmidhuber, 1997) decoder with a pointer-generator (See et al., 2017; Paulus et al., 2018).…”
Section: Semantic Parser Model (mentioning, confidence: 99%)
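
To make the pointer-generator concrete, the following is a simplified sketch of one decoding step: an LSTM cell attends over the encoder hidden states, produces a vocabulary distribution, and mixes it with a copy distribution over the source tokens. This is an illustrative reconstruction of the general technique of See et al. (2017), not the authors' code; the class name, the shared source/target vocabulary, and the matching encoder/decoder hidden sizes are all assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PointerGeneratorStep(nn.Module):
    # One decoding step of an LSTM decoder with a pointer-generator head.
    def __init__(self, hidden_size, vocab_size):
        super().__init__()
        self.lstm = nn.LSTMCell(hidden_size, hidden_size)
        self.vocab_proj = nn.Linear(2 * hidden_size, vocab_size)
        # Gate deciding between generating from the output vocabulary
        # and copying a token from the input utterance.
        self.gen_gate = nn.Linear(2 * hidden_size, 1)

    def forward(self, y_prev, state, enc_states, src_ids):
        # y_prev: (B, H) embedding of the previously emitted token
        # enc_states: (B, S, H) encoder (BERT) hidden states
        # src_ids: (B, S) vocabulary ids of the source tokens
        h, c = self.lstm(y_prev, state)
        # Dot-product attention over the encoder states.
        scores = torch.bmm(enc_states, h.unsqueeze(-1)).squeeze(-1)    # (B, S)
        attn = F.softmax(scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)  # (B, H)
        feat = torch.cat([h, context], dim=-1)                         # (B, 2H)
        gen_dist = F.softmax(self.vocab_proj(feat), dim=-1)            # (B, V)
        p_gen = torch.sigmoid(self.gen_gate(feat))                     # (B, 1)
        # Copy distribution: scatter attention mass onto source token ids.
        copy_dist = torch.zeros_like(gen_dist).scatter_add_(1, src_ids, attn)
        final_dist = p_gen * gen_dist + (1 - p_gen) * copy_dist
        return final_dist, (h, c)

At each step the decoder picks (greedily or by beam search) the next ThingTalk token from final_dist; the copy branch lets the parser reproduce entity names from the question verbatim, even when they never appear in the output vocabulary.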