Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019)
DOI: 10.18653/v1/n19-1299

Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases

Abstract: When answering natural language questions over knowledge bases (KBs), different question components and KB aspects play different roles. However, most existing embedding-based methods for knowledge base question answering (KBQA) ignore the subtle inter-relationships between the question and the KB (e.g., entity types, relation paths and context). In this work, we propose to directly model the two-way flow of interactions between the questions and the KB via a novel Bidirectional Attentive Memory Network, called …
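The "two-way flow of interactions" the abstract describes can be pictured as attention computed in both directions over a shared question-memory similarity matrix. Below is a minimal NumPy sketch of that general idea; the function name, shapes, and dot-product scoring are illustrative assumptions, not the paper's actual BAMnet architecture.

```python
# Toy sketch of two-way (bidirectional) attention between a question and KB
# memory slots. Shapes, names, and dot-product scoring are illustrative
# assumptions, not the paper's actual BAMnet design.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def two_way_attention(question, memory):
    """question: (n_q, d) word vectors; memory: (n_m, d) KB aspect vectors."""
    scores = question @ memory.T                       # (n_q, n_m) similarities
    kb_aware_q = softmax(scores, axis=1) @ memory      # words attend to KB slots
    q_aware_kb = softmax(scores.T, axis=1) @ question  # KB slots attend to words
    return kb_aware_q, q_aware_kb

# Example: 4 question tokens, 6 KB memory slots, 8-dim embeddings.
rng = np.random.default_rng(0)
kb_aware_q, q_aware_kb = two_way_attention(rng.normal(size=(4, 8)),
                                           rng.normal(size=(6, 8)))
print(kb_aware_q.shape, q_aware_kb.shape)  # (4, 8) (6, 8)
```

The point of the two directions is that the question representation becomes KB-aware (e.g., sensitive to entity types and relation paths) while the KB memory representation becomes question-aware, rather than encoding the two sides independently.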

Cited by 105 publications (67 citation statements); references 39 publications.

“…SP-based: Yavuz et al. [20] 0.516, Bao et al. [48] 0.524, Yih et al. [19] 0.525; IR-based: Hao et al. [15] 0.429, Xu et al. [4] 0.471; BAMnet and our method: BAMnet 0.557, our method 0.563…”
Section: Methods (Baseline)
Confidence: 98%
“…Most embedding-based approaches encode questions and answers independently. Hao et al. [15] proposed a cross-attention mechanism to encode questions according to various candidate answer aspects. Chen et al. [17] go one step further by modeling the bidirectional interactions between questions and a KB.…”
Section: Related Work
Confidence: 99%
“…Early neural QA systems (Kato et al., 2017; Loginova and Neumann, 2018; Chen et al., 2019) are often based on QA-LSTM models (Tan et al., 2016) with a self-attention mechanism (Lin et al., 2017) in order to visualize and illuminate the inner workings of a specific LSTM. As recent advances such as the Transformer (Vaswani et al., 2017) and BERT (Devlin et al., 2019) achieved superior performance on many NLP tasks in general domains, Transformer-based QA systems (Ma et al., 2019) and fine-tuned BERT QA systems (Yilmaz et al., 2019) have been deployed to better retrieve answers.…”
Section: Question Answering System
Confidence: 99%
“…They transform a natural language question into a formal query, for example, a SPARQL query, which in turn is executed over a knowledge base (KB) such as Freebase to retrieve answers. State-of-the-art methods (Hao et al. 2018; Mohammed, Shi, and Lin 2018; Wang et al. 2018; Chen, Wu, and Zaki 2019) have achieved promising results on simple questions that are represented as a formal query with a single predicate, for example, "who is the wife of Obama?" However, difficulties are faced when processing complex questions that correspond to a formal query with multiple predicates (Talmor and Berant 2018b), for example, "what movie that Miley Cyrus acted in had a director named Tom Vaughan?"…”
Section: Introduction
Confidence: 99%