2018
DOI: 10.1162/tacl_a_00015
Knowledge Completion for Generics using Guided Tensor Factorization

Abstract: Given a knowledge base or KB containing (noisy) facts about common nouns or generics, such as "all trees produce oxygen" or "some animals live in forests", we consider the problem of inferring additional such facts at a precision similar to that of the starting KB. Such KBs capture general knowledge about the world, and are crucial for various applications such as question answering. Different from commonly studied named entity KBs such as Freebase, generics KBs involve quantification, have more complex underl…

Cited by 5 publications (5 citation statements)
References 20 publications
“…Models on the HotpotQA would not be directly applicable to our task and require substantial modification for the following reasons: (i) HotpotQA models are not trained to predict the qualitative structure (more or less of chosen explanation sentences in Figure 1). (ii) HotpotQA involves reasoning over named entities, whereas the current task focuses on common nouns and actions (models that work well on named entities need to be adapted to common nouns and actions (Sedghi and Sabharwal, 2018)). (iii) explanation paragraphs in HotpotQA are not procedural while the current input is procedural in nature with a specific chronological structure.…”
Section: Related Work
confidence: 99%
“…Augmentation by grounding of the rules The simplest way to incorporate a set of rules in the KG is to augment the KG with their groundings (Sedghi and Sabharwal, 2018) before learning the embedding. Demeester, Rocktäschel, and Riedel (2016) address the computational inefficiency of this approach through lifted rule injection.…”
Section: Related Work
confidence: 99%
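The citation above describes augmenting a knowledge graph with groundings of logical rules before learning embeddings. A minimal sketch of that idea, using a made-up rule and relation names (the specific rule, entities, and relations here are illustrative assumptions, not from the paper):

```python
# Hypothetical sketch of "augmentation by grounding": before training a
# KG embedding, expand the triple set with all groundings of a simple
# implication rule. Rule and relation names are illustrative only.

triples = {
    ("oak", "isA", "tree"),
    ("pine", "isA", "tree"),
    ("tree", "produces", "oxygen"),
}

# Rule: isA(x, c) AND produces(c, y)  ->  produces(x, y)
def ground_rule(kb):
    derived = set()
    for (x, r1, c) in kb:
        if r1 != "isA":
            continue
        for (c2, r2, y) in kb:
            if r2 == "produces" and c2 == c:
                derived.add((x, "produces", y))
    return derived

# The embedding model would then be trained on the augmented triple set.
augmented = triples | ground_rule(triples)
print(sorted(augmented - triples))
```

Grounding every rule this way can blow up quadratically in KB size, which is the computational inefficiency that the lifted rule injection of Demeester, Rocktäschel, and Riedel (2016) avoids.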
“…A multi-dimensional array or tensor has been a fundamental component for numerous applications including signal processing [1–3], computer vision [4–6], graph analysis [7, 8], and statistics [9]. Tensor decomposition is a generalization of the matrix decomposition, and plays an important role in latent feature discovery and estimation of unobservable entries [10–12].…”
Section: Introduction
confidence: 99%
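The citation above notes that tensor decomposition generalizes matrix decomposition and supports estimating unobserved entries. A minimal sketch of one standard decomposition, CP (CANDECOMP/PARAFAC) fitted by alternating least squares; the dimensions, rank, and factor names below are our own illustrative choices, not the method of the cited paper:

```python
import numpy as np

# CP sketch: approximate a 3-way tensor T as a sum of R rank-1 terms,
# T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]. Illustrative only.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2

# Build a ground-truth rank-2 tensor so an exact fit exists.
A0 = rng.normal(size=(I, R))
B0 = rng.normal(size=(J, R))
C0 = rng.normal(size=(K, R))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

# Alternating least squares: fix two factors, solve for the third
# against the matching unfolding of T (Khatri-Rao product on the right).
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
C = rng.normal(size=(K, R))
for _ in range(200):
    KA = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)
    A = np.linalg.lstsq(KA, T.reshape(I, J * K).T, rcond=None)[0].T
    KB = np.einsum('ir,kr->ikr', A, C).reshape(I * K, R)
    B = np.linalg.lstsq(KB, T.transpose(1, 0, 2).reshape(J, I * K).T, rcond=None)[0].T
    KC = np.einsum('ir,jr->ijr', A, B).reshape(I * J, R)
    C = np.linalg.lstsq(KC, T.transpose(2, 0, 1).reshape(K, I * J).T, rcond=None)[0].T

# Reconstruct and measure relative error; unobserved entries would be
# read off T_hat in a completion setting.
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
err = np.linalg.norm(T_hat - T) / np.linalg.norm(T)
print(round(err, 6))
```

Once the factors are learned from observed entries only, any missing entry T[i,j,k] can be estimated from the same inner product of factor rows, which is the "estimation of unobservable entries" role the citation describes.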