2020
DOI: 10.1609/aaai.v34i05.6318

Infusing Knowledge into the Textual Entailment Task Using Graph Convolutional Networks

Abstract: Textual entailment is a fundamental task in natural language processing. Most approaches for solving this problem use only the textual content present in training data. A few approaches have shown that information from external knowledge sources like knowledge graphs (KGs) can add value, in addition to the textual content, by providing background knowledge that may be critical for a task. However, the proposed models do not fully exploit the information in the usually large and noisy KGs, and it is not clear h…
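The abstract describes encoding external knowledge-graph information with graph convolutional networks (GCNs) before combining it with the textual representation. As a rough illustration of the core GCN operation (a minimal sketch, not the paper's actual implementation; all names, dimensions, and the toy graph are assumptions), a single layer propagates node features over a normalized adjacency matrix:

```python
# Minimal single-layer GCN sketch (Kipf & Welling style propagation).
# Illustrative assumption only, not the implementation from the cited paper.
import numpy as np

def gcn_layer(adj: np.ndarray, node_feats: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """One graph-convolution step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    deg = a_hat.sum(axis=1)                            # node degrees
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))           # D^-1/2
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt           # symmetric normalization
    return np.maximum(a_norm @ node_feats @ weight, 0.0)  # ReLU activation

# Toy KG subgraph: 3 concept nodes, 4-dim input features projected to 2 dims.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
feats = np.random.randn(3, 4)
w = np.random.randn(4, 2)
graph_repr = gcn_layer(adj, feats, w).mean(axis=0)     # pooled subgraph vector
```

The pooled subgraph vector can then serve as a background-knowledge signal alongside the sentence representations used for the entailment decision.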

Cited by 30 publications (18 citation statements)
References 19 publications
“…On the other hand, feeding the commonsense knowledge gradually during the agent's learning process provides more focus to the exploration, and drives it toward the concepts related to the rest of the goals. These results can also be seen as an RL-centric agent-based validation of similar results shown in the broader NLP literature (Kapanipathi et al 2020). We refer the reader to (Murugesan et al 2020) on further discussion on this topic.…”
Section: D2 Discussion (supporting)
confidence: 77%
“…Traditional Attention-Based Models do not utilise contextual representations from PTLMs [7]. KG-Augmented Entailment System (KES) [11] augments the NLI model with external knowledge encoded using graph convolutional networks. Knowledge-based Inference Model (KIM) [3] incorporates lexical-level knowledge (such as synonym and antonym) into its attention and composition components.…”
Section: Related Work (mentioning)
confidence: 99%
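The statement above characterizes KES as augmenting an NLI model with external knowledge encoded by graph convolutional networks. As a hedged sketch of how such a fusion could look (the dimensions, pooling, and linear classifier are assumptions for illustration, not the published KES architecture), the pooled graph vector can simply be concatenated with the premise and hypothesis encodings before a three-way classification layer:

```python
# Hedged sketch: fusing text and KG-subgraph vectors for 3-way NLI.
# Dimensions, encoders, and the fusion choice are illustrative assumptions.
import numpy as np

def classify_entailment(premise_vec, hypothesis_vec, graph_vec, w_out, b_out):
    """Concatenate sentence and graph vectors, then score
    entailment / contradiction / neutral with a linear layer + softmax."""
    fused = np.concatenate([premise_vec, hypothesis_vec, graph_vec])
    logits = w_out @ fused + b_out
    exp = np.exp(logits - logits.max())        # numerically stable softmax
    return exp / exp.sum()

premise_vec = np.random.randn(128)     # e.g. pooled sentence-encoder output
hypothesis_vec = np.random.randn(128)
graph_vec = np.random.randn(64)        # e.g. pooled GCN output over a KG subgraph
w_out = np.random.randn(3, 320)        # 128 + 128 + 64 = 320 fused features
b_out = np.zeros(3)
probs = classify_entailment(premise_vec, hypothesis_vec, graph_vec, w_out, b_out)
```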
“…BERT model has shown to be robust to adversarial examples when external knowledge is incorporated to the attention mechanism using simple transformations [15]. The KES [11] model highlighted above further evaluates their system with BERT contextual embeddings in the framework.…”
Section: Related Work (mentioning)
confidence: 99%
“…At the same time, these sense embeddings can also serve as an entry point to many other knowledge bases linked to WordNet, such as the multilingual knowledge graph of BabelNet (Navigli & Ponzetto, 2010), the common-sense triples of ConceptNet (Speer et al, 2017) or WebChild (Tandon et al, 2017), the semantic frames of VerbNet (Schuler, 2006), and even the images of ImageNet (Russakovsky et al, 2015) or Visual Genome (Krishna et al, 2016). Several recent works have used the symbolic relations expressed in these knowledge bases to improve neural solutions to Natural Language Inference (Kapanipathi et al, 2020), Commonsense Reasoning (Lin et al, 2019), Story Generation (Ammanabrolu et al, 2020), among others.…”
Section: Knowledge Integration (mentioning)
confidence: 99%