Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
DOI: 10.18653/v1/2021.naacl-main.464
Breadth First Reasoning Graph for Multi-hop Question Answering

Abstract: Recently, Graph Neural Networks (GNNs) have been used as a promising tool in the multi-hop question answering task. However, unnecessary updates and simple edge constructions prevent accurate answer span extraction in a more direct and interpretable way. In this paper, we propose a novel model, the Breadth First Reasoning Graph (BFR-Graph), which presents a new message passing scheme that better conforms to the reasoning process. In BFR-Graph, the reasoning message is required to start from the question node and p…

Cited by 19 publications (21 citation statements)
References 20 publications
“…Zhou et al. [14] presented the interpretable reasoning network (INR). The INR model utilizes an interpretable, hop-by-hop reasoning process to answer the question.…”
Section: Related Work
confidence: 99%
“…There has been significant research in recent years to solve this task. A variety of methods model the task as performing inference over static or dynamic graphs to find reasoning paths [14,15,30,34,40,61,113,128,142,179,181]. A number of works have also attempted to decompose multi-hop questions into single-hop questions or generate follow-up questions based on the retrieved information [14,95,102,138,181].…”
Section: ♂ Available Context – B's Father Is C and Her Mother Is A
confidence: 99%
“…An LSTM is applied to fuse the outputs of the two graphs. • Huang and Yang [61] form a sentence graph and add an edge between sentences s_i and s_j with the weight w_ij given by:…”
Section: Graph-based Techniques
confidence: 99%
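The citation above describes a sentence graph whose edges carry pairwise weights w_ij, but the actual weight formula is truncated. As a hedged illustration only, the sketch below builds such a graph with a hypothetical normalized word-overlap weight (Jaccard similarity) standing in for the unknown w_ij; the function name and weighting choice are assumptions, not the authors' method.

```python
def build_sentence_graph(sentences):
    """Return a dict {(i, j): w_ij} with one weighted edge per
    sentence pair, using Jaccard word overlap as a stand-in weight.
    The real w_ij formula from Huang and Yang [61] is not shown in
    the truncated citation, so this weight is purely illustrative."""
    tokenized = [set(s.lower().split()) for s in sentences]
    edges = {}
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            overlap = tokenized[i] & tokenized[j]
            union = tokenized[i] | tokenized[j]
            w = len(overlap) / len(union) if union else 0.0
            if w > 0:
                edges[(i, j)] = w  # edge between s_i and s_j with weight w_ij
    return edges

sents = [
    "Paris is the capital of France",
    "France is in Europe",
    "Berlin is the capital of Germany",
]
print(build_sentence_graph(sents))
```

In a GNN-based multi-hop QA pipeline, such a weighted sentence graph would then serve as the structure over which reasoning messages are passed.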