2019
DOI: 10.1609/aaai.v33i01.33017410

Combining Axiom Injection and Knowledge Base Completion for Efficient Natural Language Inference

Abstract: In logic-based approaches to reasoning tasks such as Recognizing Textual Entailment (RTE), it is important for a system to have a large amount of knowledge data. However, there is a tradeoff between adding more knowledge data for improved RTE performance and keeping the RTE system efficient, since a large database is costly in both memory usage and computational complexity. In this work, we show that the processing time of a state-of-the-art logic-based RTE system can be significantly reduced by rep…
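The mechanism the abstract describes, querying a KBC model on demand instead of searching a large static axiom database, can be illustrated as follows. This is a minimal sketch, not the authors' implementation: the DistMult-style scorer, the sigmoid threshold, and all names (score_triple, inject_axioms, THRESHOLD) are assumptions made for the example.

```python
# Minimal sketch: on-demand axiom injection backed by a KBC scorer.
# All names and the threshold are illustrative, not from the paper.
import numpy as np

THRESHOLD = 0.5  # assumed acceptance threshold for predicted axioms

def score_triple(h, r, t):
    """DistMult-style trilinear score <h, r, t>, one common KBC scorer."""
    return float(np.sum(h * r * t))

def inject_axioms(unproved_pairs, ent_emb, rel_emb, relation="hypernym"):
    """For each (premise_word, hypothesis_word) pair the prover could not
    link, ask the KBC model whether the connecting axiom is plausible,
    and emit it as a first-order formula if so."""
    axioms = []
    r = rel_emb[relation]
    for h, t in unproved_pairs:
        s = score_triple(ent_emb[h], r, ent_emb[t])
        if 1.0 / (1.0 + np.exp(-s)) > THRESHOLD:  # sigmoid as pseudo-probability
            axioms.append(f"all x.({h}(x) -> {t}(x))")
    return axioms
```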

Cited by 7 publications (5 citation statements)
References 10 publications
“…In future work, we will extend our analysis to cover the more complex constructions mentioned in Section 3. We are also considering combining our system with an abduction mechanism that uses large knowledge bases (Yoshikawa et al., 2019) for handling commonsense reasoning with external knowledge.…”
Section: Discussion
confidence: 99%
“…Yoshikawa et al. [6] proposed an alternative method for extending ccg2lambda by introducing an efficient mechanism for axiom injection based on Knowledge Base Completion (KBC) models [39]. KBC models are machine learning models that have recently seen significant advancements.…”
Section: Related Work (A. Logic-Based Approaches to RTE Tasks)
confidence: 99%
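For readers unfamiliar with KBC: such models assign a plausibility score to a candidate triple (head, relation, tail) from learned embeddings. One widely used scorer is ComplEx (Trouillon et al., 2016), whose score is the real part of a trilinear product over complex-valued embeddings. A minimal sketch with randomly initialized placeholder embeddings (a trained model would learn these vectors):

```python
import numpy as np

def complex_score(e_h, w_r, e_t):
    """ComplEx score: Re(<e_h, w_r, conj(e_t)>)."""
    return float(np.real(np.sum(e_h * w_r * np.conj(e_t))))

rng = np.random.default_rng(0)
dim = 50  # embedding dimension, placeholder
e_dog    = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
e_animal = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
w_hyper  = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
print(complex_score(e_dog, w_hyper, e_animal))  # higher score = more plausible triple
```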
“…This task predicts whether a given premise sentence entails a hypothesis sentence. Logic-based approaches [2], [3], [4], [5], [6] and machine learning approaches [7], [8], [9] are the primary approaches to RTE. Logic-based approaches use logical formulas to represent the linguistic meanings of sentences and try to prove entailment relations between formulas.…”
Section: Introduction
confidence: 99%
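To make the quoted description concrete: in a logic-based RTE system, an entailment such as "A dog is running" entails "An animal is running" is provable only once a lexical axiom like ∀x. dog(x) → animal(x) is available, which is exactly what axiom injection supplies. Below is a minimal sketch using NLTK's built-in resolution prover, a lightweight stand-in for the heavier provers such systems actually use; the example sentences and predicates are illustrative.

```python
from nltk.sem import Expression
from nltk.inference import ResolutionProver

read = Expression.fromstring
premise    = read('exists x.(dog(x) & run(x))')     # "A dog is running"
hypothesis = read('exists x.(animal(x) & run(x))')  # "An animal is running"
axiom      = read('all x.(dog(x) -> animal(x))')    # injected lexical knowledge

# Without the axiom the proof fails; with it, entailment goes through.
print(ResolutionProver().prove(hypothesis, [premise]))         # False
print(ResolutionProver().prove(hypothesis, [premise, axiom]))  # True
```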
“…The method proposed by Shen et al. [33] uses a global memory and a controller module to learn multi-hop paths in vector space and infers missing facts jointly without any human-designed procedure. Yoshikawa et al. [43] achieve KBC by combining it with axiom injection as an inference basis. By constructing rich features, Komninos and Manandhar [47] use a neural network to complete missing relations.…”
Section: Knowledge Base Completion
confidence: 99%