2021
DOI: 10.48550/arxiv.2105.10334
Preprint
Fact-driven Logical Reasoning

Abstract: Logical reasoning, which is closely related to human cognition, is of vital importance in humans' understanding of texts. Recent years have witnessed increasing attention on machines' logical reasoning abilities. However, previous studies commonly apply ad-hoc methods to model pre-defined relation patterns, such as linking named entities, which only considers global knowledge components related to commonsense, without local perception of complete facts or events. Such a methodology is obviously insuffi…

Cited by 2 publications (5 citation statements)
References 30 publications

“…But it simply forms a chain-type discourse network and weakens the relations between two distant units. FocalReasoner [21] stresses that fact units in the form of subject-verb-object are significant for logical reasoning. It constructs a supergraph on top of the fact units and updates the node features relying on a Graph Neural Network.…”
Section: Logical Reasoning
confidence: 99%
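For readers unfamiliar with the supergraph construction this statement summarizes, the sketch below illustrates the general idea in Python: one node per subject-verb-object fact unit, an edge whenever two facts share a surface argument, and a single GCN-style message-passing step. The toy facts, the edge rule, and all names are assumptions for exposition, not FocalReasoner's actual construction.

```python
# Illustrative sketch only: build a supergraph over SVO fact units and
# run one round of GCN-style feature updating. The edge rule (facts
# sharing an argument are connected) is an assumption, not the paper's.
import torch

facts = [
    ("Zidane", "won", "World Cup"),      # toy fact units
    ("he", "scored", "goal"),
    ("World Cup", "was held in", "France"),
]

# Adjacency matrix of the supergraph, with self-loops.
n = len(facts)
adj = torch.eye(n)
for i in range(n):
    for j in range(i + 1, n):
        if {facts[i][0], facts[i][2]} & {facts[j][0], facts[j][2]}:
            adj[i, j] = adj[j, i] = 1.0

# Symmetric normalization D^{-1/2} A D^{-1/2}, as in a basic GCN layer.
deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
norm_adj = deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)

hidden = 8                               # toy width
x = torch.randn(n, hidden)               # placeholder node embeddings
w = torch.nn.Linear(hidden, hidden)
x_updated = torch.relu(norm_adj @ w(x))  # one message-passing step
```

In the real model the initial node embeddings would come from the text encoder rather than from `torch.randn`.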
“…• DAGN [12]: It proposed a discourse-aware network, which took RoBERTa-Large [19] as the token encoder and employed a GNN for the feature update. • FocalReasoner [21]: It focused on the fact units extracted from the text and built a supergraph for the reasoning. Similar to DAGN, it also leveraged RoBERTa-Large and a GNN [26] for the token embedding and node update, respectively.…”
Section: Datasets and Baselines
confidence: 99%
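As a rough illustration of the shared encoder-to-graph pipeline both baselines use, the sketch below mean-pools RoBERTa token states over spans to initialize graph node features. The span offsets are invented for this example, and `roberta-base` stands in for the RoBERTa-Large checkpoint the cited systems actually use.

```python
# Rough sketch: RoBERTa token embeddings are mean-pooled over spans to
# produce node features that a GNN (like the step sketched above) would
# then update. Span offsets here are made up for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
encoder = AutoModel.from_pretrained("roberta-base")

text = "Zidane won the World Cup in 1998. He scored in the final."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    token_states = encoder(**inputs).last_hidden_state  # (1, seq, hidden)

# Illustrative token-index spans, one per fact unit / discourse segment.
spans = [(1, 6), (9, 13)]
node_feats = torch.stack(
    [token_states[0, start:end].mean(dim=0) for start, end in spans]
)  # (num_nodes, hidden): the initial node embeddings for the graph
```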
“…He won the World Cup in 1998", the words "Zidane" and "he" refer to the same person. Coreference resolution is used by the Focal Reasoner model (Ouyang et al., 2021) to construct a graph of fact triples, where the same mentions are connected with an undirected edge. In LogiTorch, we implemented a wrapper over a fine-tuned SpanBERT (Joshi et al., 2020) for coreference resolution.…”
Section: Utilities
confidence: 99%
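To make the undirected-edge construction concrete, here is a minimal sketch that expands coreference clusters into graph edges. The hard-coded cluster stands in for the output of a coreference model (e.g. a fine-tuned SpanBERT); nothing here reflects LogiTorch's actual API.

```python
# Minimal sketch: turn coreference clusters into undirected graph edges,
# in the spirit of the construction the statement describes. The cluster
# below is hard-coded; in practice it would come from a coref model.
import itertools

# Fact-graph node ids whose mentions corefer ("Zidane" and "he").
coref_clusters = [[0, 1]]

edges = set()
for cluster in coref_clusters:
    for u, v in itertools.combinations(cluster, 2):
        edges.add((min(u, v), max(u, v)))  # one orientation == undirected

print(sorted(edges))  # [(0, 1)]
```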
“…This model is more robust on examples containing negations, and performs better on the negated NLI dataset than the original BERT. Future releases will include newer models such as LReasoner (Huang et al., 2021), Focal Reasoner (Ouyang et al., 2021), AdaLoGN (Li et al., 2022), Logiformer (Xu et al., 2022), and LogiGAN (Pi et al., 2022).…”
Section: Models
confidence: 99%