2020
DOI: 10.1609/aaai.v34i10.7249

HGMAN: Multi-Hop and Multi-Answer Question Answering Based on Heterogeneous Knowledge Graph (Student Abstract)

Abstract: Multi-hop question answering models based on knowledge graphs have been extensively studied. Most existing models predict a single answer with the highest probability by ranking candidate answers. However, the ranking method prevents them from predicting all of the correct answers. In this paper, we propose a novel model that converts the ranking of candidate answers into individual predictions for each candidate, named heterogeneous knowledge graph based multi-hop and multi-answer model (HGMAN). HGMAN is c…

Cited by 5 publications (10 citation statements)
References 4 publications
“…We choose GNN-based retrieval models as our baselines since they have achieved high performance across different KGQA datasets without additional information (query-answer paths or semantic parses). We experiment with three relevant KGQA retrieval techniques, namely Embed-KGQA (Saxena et al., 2020), Rel-GCN (Wang et al., 2020a), and GlobalGraph (Wang et al., 2020b). We do not use baselines that require additional textual information to generate the heterogeneous graph, such as GraftNet (Sun et al., 2018) or PullNet (Sun et al., 2019), since this information is not available to us.…”
Section: Baselinesmentioning
confidence: 99%
“…Rel-GCN: The Rel-GCN approach of Wang et al. (2020a) first constructs a smaller sub-graph K_q for a given question from the large base knowledge graph K, using PPR (Haveliwala, 2003). They encode the question q with a PTLM as v_q, and use TransE (Bordes et al., 2013) on K to obtain a representation v_{e_i} for each node e_i in K. They concatenate each node embedding with the question embedding v_q, and then run an RGCN over K_q to obtain updated representations. These updated representations are used to score whether a given node is an answer or not.…”
Section: B1 Baselinesmentioning
confidence: 99%
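The pipeline described in this excerpt (node embeddings concatenated with a question embedding, one relational message-passing layer over the sub-graph, then an independent answer score per node) can be sketched very roughly as follows. This is a minimal illustration, not the authors' code: the dimensions, the toy sub-graph `adj`, and the random weights are all made up, and the real model would learn its parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# Stand-ins for TransE node embeddings v_{e_i} (4 sub-graph nodes)
# and a PTLM question embedding v_q.
node_emb = rng.normal(size=(4, d))
v_q = rng.normal(size=(d,))

# Concatenate each node embedding with the question embedding.
h = np.concatenate([node_emb, np.tile(v_q, (4, 1))], axis=1)  # shape (4, 2d)

# One simplified R-GCN-style layer: a weight matrix per relation,
# mean-aggregation over neighbors.  adj[r][i] lists the neighbors of
# node i under relation r in the toy sub-graph K_q.
adj = {0: {0: [1], 1: [0, 2], 2: [1], 3: []},
       1: {0: [3], 1: [], 2: [3], 3: [0, 2]}}
W = {r: rng.normal(size=(2 * d, 2 * d)) * 0.1 for r in adj}
W_self = rng.normal(size=(2 * d, 2 * d)) * 0.1

def rgcn_layer(h):
    out = h @ W_self                      # self-loop transform
    for r, nbrs in adj.items():
        for i, js in nbrs.items():
            if js:                        # relation-specific neighbor message
                out[i] += np.mean(h[js], axis=0) @ W[r]
    return np.maximum(out, 0.0)           # ReLU

h = rgcn_layer(h)

# Independent binary answer score per node (sigmoid over a linear read-out).
w_out = rng.normal(size=(2 * d,))
scores = 1.0 / (1.0 + np.exp(-(h @ w_out)))
```

Scoring each node independently, rather than ranking candidates against one another, is what allows multiple answers per question.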
“…Supported by a number of studies on graph representation learning (Kipf and Welling, 2017; Schlichtkrull et al., 2018; Wang et al., 2019a), graph neural networks (GNNs) have shown powerful capabilities in graph analysis. Many GNN-based algorithms have been designed to perform graph reasoning, such as R-GCN (Schlichtkrull et al., 2018), GRAFT-Net (Sun et al., 2018), HGMAN (Wang et al., 2020) and BAG (Cao et al., 2019), in which nodes update themselves by aggregating the information of neighboring nodes. A node can capture information from unconnected nodes through multiple GNN layers.…”
Section: Graph Neural Network Based Question Answeringmentioning
confidence: 99%
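The aggregation idea in the excerpt above, including why stacking layers widens a node's receptive field, can be shown with a deliberately tiny sketch. Nothing here comes from the cited papers: one scalar feature per node, a three-node path graph 0–1–2, and plain averaging in place of learned weights.

```python
# Only node 2 carries information initially; node 0 is two hops away.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
h = {0: 0.0, 1: 0.0, 2: 9.0}

def gnn_layer(h):
    # Each node averages its own value with its neighbors' values.
    return {i: (h[i] + sum(h[j] for j in nbrs)) / (1 + len(nbrs))
            for i, nbrs in neighbors.items()}

h1 = gnn_layer(h)   # after 1 layer: node 0 is still 0.0 (node 2 unreachable)
h2 = gnn_layer(h1)  # after 2 layers: node 2's signal has reached node 0
print(h1[0], h2[0])  # → 0.0 1.5
```

One layer propagates information one hop; k layers let a node see its k-hop neighborhood, which is why multi-hop reasoning typically stacks several GNN layers.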
“…Although these methods prove effective, their processing steps and conversion process are relatively complex and may involve expert knowledge or heuristic rules. Answering multi-hop questions requires searching for reasoning paths that start from the entity mentioned in the question and consist of both the relations at each hop and the intermediate entities, so recent studies [12][13][14] have focused on the power of graph neural networks (GNNs) to overcome the above limitations. They often model the question directly to obtain a candidate entity graph, and then leverage GNN-based information propagation to update the node representations in the graph, which are used to choose the answer entities.…”
Section: Introductionmentioning
confidence: 99%
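The "reasoning path" notion in the excerpt above can be made concrete with a small sketch: starting from the entity mentioned in the question, follow relations hop by hop, recording both the relation at each hop and the intermediate entities. The toy knowledge graph, entity names, and the `reasoning_paths` helper are all hypothetical, chosen only to illustrate the path structure.

```python
# Toy KG: head entity -> list of (relation, tail entity) edges.
kg = {
    "Obama":    [("born_in", "Honolulu")],
    "Honolulu": [("located_in", "Hawaii")],
    "Hawaii":   [("part_of", "USA")],
}

def reasoning_paths(start, max_hops):
    """Enumerate alternating entity/relation paths of up to max_hops from start."""
    frontier = [[start]]
    for _ in range(max_hops):
        extended = []
        for path in frontier:
            for rel, tail in kg.get(path[-1], []):
                extended.append(path + [rel, tail])
        yield from extended
        frontier = extended

paths = list(reasoning_paths("Obama", 2))
# e.g. the 2-hop path ['Obama', 'born_in', 'Honolulu', 'located_in', 'Hawaii']
```

A GNN-based approach replaces this explicit enumeration with message passing over the candidate entity graph, but the information a path carries (relations plus intermediate entities) is the same.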