Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2018
DOI: 10.18653/v1/p18-1200
Interpretable and Compositional Relation Learning by Joint Training with an Autoencoder

Abstract: Embedding models for entities and relations are extremely useful for recovering missing facts in a knowledge base. Intuitively, a relation can be modeled by a matrix mapping entity vectors. However, relations reside on low-dimension sub-manifolds in the parameter space of arbitrary matrices: for one reason, the composition of two relations M_1, M_2 may match a third relation M_3 (e.g. the composition of the relations currency of country and country of film usually matches currency of film budget), which imposes compositional constraints …
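To make the compositional constraint concrete, here is a minimal numpy sketch (the dimension, matrices, and variable names are toy values invented for illustration, not the paper's trained parameters): relations act as matrices on entity vectors, and the matrix product of two relations can match a third relation directly.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy embedding dimension; real models use hundreds

# A relation maps a head-entity vector to (approximately) a tail-entity vector.
M_country_of_film = rng.standard_normal((d, d))      # film -> country
M_currency_of_country = rng.standard_normal((d, d))  # country -> currency

# The compositional constraint from the abstract: the product of the two
# relation matrices should approximately equal the direct relation,
# i.e. M_1 @ M_2 ≈ M_3. Here M_3 is constructed to satisfy it exactly.
M_currency_of_film_budget = M_currency_of_country @ M_country_of_film

film = rng.standard_normal(d)  # a toy entity vector
via_path = M_currency_of_country @ (M_country_of_film @ film)
via_direct = M_currency_of_film_budget @ film
print(np.allclose(via_path, via_direct))  # True: the composition matches
```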

Cited by 11 publications (4 citation statements) · References 31 publications
“We train embeddings for 6 million Wikidata entities using feature-specific autoencoders to encode entity features such as names, aliases, descriptions, entity types, and numeric attributes. This approach follows prior work on multimodal KB embeddings (Pezeshkpour et al., 2018) and learning of KB embeddings with autoencoders (Takahashi et al., 2018). Embedding training is detailed in Appendix D.…”
Section: Continuous Representation
mentioning, confidence: 99%
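As a small illustration of the quoted setup, below is a minimal sketch of what a feature-specific autoencoder for entity features could look like (the framework choice, layer sizes, and all names are assumptions made for this sketch; the cited papers define the actual architectures): a linear encoder compresses a bag-of-features vector into a dense entity embedding, and a decoder reconstructs the input.

```python
import torch
import torch.nn as nn

class FeatureAutoencoder(nn.Module):
    """Toy feature-specific autoencoder: one instance per feature type
    (names, aliases, descriptions, ...); all sizes are hypothetical."""
    def __init__(self, n_features: int = 10_000, dim: int = 128):
        super().__init__()
        self.encoder = nn.Linear(n_features, dim)  # features -> embedding
        self.decoder = nn.Linear(dim, n_features)  # embedding -> reconstruction

    def forward(self, x):
        z = torch.relu(self.encoder(x))  # dense entity embedding
        return z, self.decoder(z)

# One reconstruction step on a random batch of entity feature vectors.
model = FeatureAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 10_000)               # toy bag-of-features batch
z, x_hat = model(x)
loss = nn.functional.mse_loss(x_hat, x)  # reconstruction loss
loss.backward()
opt.step()
```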
“…Recent research has also shown that relation paths between entities in KGs provide richer context information and improve the performance of embedding models for KG completion (Luo et al., 2015; Liang and Forbus, 2015; García-Durán et al., 2015; Guu et al., 2015; Toutanova et al., 2016; Durán and Niepert, 2018; Takahashi et al., 2018; Chen et al., 2018). In particular, Luo et al. (2015) constructed relation paths between entities and, viewing the entities and relations in a path as pseudo-words, applied Word2Vec (Mikolov et al., 2013) to produce pre-trained vectors for these pseudo-words.…”
Section: Relation Path-based Embedding Models
mentioning, confidence: 99%
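As a rough sketch of the pseudo-word idea attributed to Luo et al. (2015) (the toy paths, token format, and hyperparameters are invented here, and gensim stands in for the original Word2Vec implementation): each relation path is serialized as a sentence of entity and relation tokens, and Word2Vec is trained on those sentences to obtain pre-trained vectors.

```python
from gensim.models import Word2Vec

# Toy relation paths serialized as alternating entity/relation tokens;
# both entities and relations become "pseudo-words".
paths = [
    ["Titanic", "country_of_film", "USA", "currency_of_country", "US_dollar"],
    ["Amelie", "country_of_film", "France", "currency_of_country", "euro"],
]

# Skip-gram Word2Vec over the pseudo-word sentences; sizes are illustrative.
model = Word2Vec(sentences=paths, vector_size=16, window=2, min_count=1, sg=1)

vec = model.wv["country_of_film"]  # pre-trained vector for a relation token
print(vec.shape)                   # (16,)
```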
“…Most subsequent models focus on integrating the various role information brought by different relational attributes into entity representations for improvement. Lao et al. [33, 34] and Takahashi et al. [59] verify the effectiveness of applying the contextual information contained in relation paths to KBC, demonstrating the value of relational features from another perspective. However, exquisitely designed models with stronger expressivity often come with higher computational costs.…”
Section: Introduction
mentioning, confidence: 99%