Proceedings of the 2018 2nd International Conference on Artificial Intelligence: Technologies and Applications (ICAITA 2018)
DOI: 10.2991/icaita-18.2018.49
Learning Entity and Relation Embeddings with Entity Description for Knowledge Graph Completion

Abstract: With the growth of existing knowledge graphs, knowledge graph completion has become a crucial problem. In this paper, we propose a novel model based on the description-embodied knowledge representation learning framework, which is able to take advantage of both fact triples and entity descriptions. Specifically, the relation projection is combined with description-embodied representation learning to learn entity and relation embeddings. A convolutional neural network and TransR are adopted to get the d…
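As a rough illustration of the idea in the abstract (not the authors' exact architecture), the sketch below combines a structure-based entity embedding with a description embedding produced by a small 1-D CNN, and scores triples with a TransR-style relation projection. The class name, dimensions, and the simple averaging of the two entity views are assumptions for illustration only.

```python
# Minimal sketch, assuming a TransR-style scorer plus a CNN description encoder;
# all names, dimensions, and the averaging step are illustrative, not the paper's.
import torch
import torch.nn as nn

class DescriptionTransR(nn.Module):
    def __init__(self, n_ent, n_rel, vocab, d_ent=100, d_rel=100, d_word=50):
        super().__init__()
        self.ent_emb = nn.Embedding(n_ent, d_ent)        # structure-based entity vectors
        self.rel_emb = nn.Embedding(n_rel, d_rel)        # relation vectors
        self.proj = nn.Embedding(n_rel, d_rel * d_ent)   # per-relation projection matrix M_r
        self.word_emb = nn.Embedding(vocab, d_word)      # word vectors of entity descriptions
        self.cnn = nn.Sequential(                        # description encoder
            nn.Conv1d(d_word, d_ent, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        self.d_ent, self.d_rel = d_ent, d_rel

    def entity(self, e_idx, desc_tokens):
        """Combine structure and description views of an entity (here: averaged)."""
        desc = self.cnn(self.word_emb(desc_tokens).transpose(1, 2)).squeeze(-1)
        return 0.5 * (self.ent_emb(e_idx) + desc)

    def score(self, h_idx, h_desc, r_idx, t_idx, t_desc):
        """TransR-style score ||M_r h + r - M_r t|| (lower = more plausible)."""
        M_r = self.proj(r_idx).view(-1, self.d_rel, self.d_ent)
        h = self.entity(h_idx, h_desc).unsqueeze(-1)
        t = self.entity(t_idx, t_desc).unsqueeze(-1)
        r = self.rel_emb(r_idx)
        return ((M_r @ h).squeeze(-1) + r - (M_r @ t).squeeze(-1)).norm(p=2, dim=-1)
```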

Cited by 20 publications (5 citation statements) | References 11 publications
“…The KG embedding model embeds all entities and relationships into low-dimensional vectors and captures their semantics [27]. For each e ∈ ℰ and l ∈ ℒ, KG embedding models generate e_e ∈ ℝ^{d_e} and e_r ∈ ℝ^{d_r}, where e_e and e_r are d_e- and d_r-dimensional vectors, respectively.…”
Section: Methods
Mentioning confidence: 99%
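A minimal numeric illustration of the notation in the quoted passage: each entity and relation is mapped to a dense vector. The entity names, dimensions, random initialisation, and the TransE-style plausibility check are placeholders, not the cited model's parameters.

```python
# Illustrative sketch of "e_e in R^{d_e}, e_r in R^{d_r}"; values are placeholders.
import numpy as np

rng = np.random.default_rng(0)
entities = ["Paris", "France", "Berlin", "Germany"]
relations = ["capital_of"]
d_e, d_r = 100, 100                                   # embedding dimensions

E = {e: rng.normal(size=d_e) for e in entities}      # e_e in R^{d_e}
R = {l: rng.normal(size=d_r) for l in relations}     # e_r in R^{d_r}

# TransE-style plausibility check: h + r should lie close to t (lower = better).
score = np.linalg.norm(E["Paris"] + R["capital_of"] - E["France"])
```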
“…The tail entities of the missing triples are completed using the score function of the ComplEx model. Each completed triple receives a corresponding score [25, 27].…”
Section: Methods
Mentioning confidence: 99%
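A hedged sketch of ComplEx-based tail completion as described above: candidate tails for a query (h, r, ?) are ranked by the real part of the trilinear product of complex embeddings. The toy dimensions and random vectors stand in for trained parameters.

```python
# Sketch of tail ranking with the ComplEx score Re(<e_h, w_r, conj(e_t)>);
# dimensions and random embeddings are placeholders, not the cited models'.
import numpy as np

rng = np.random.default_rng(0)
d, n_ent, n_rel = 64, 1000, 5
E = rng.normal(size=(n_ent, d)) + 1j * rng.normal(size=(n_ent, d))  # complex entity embeddings
W = rng.normal(size=(n_rel, d)) + 1j * rng.normal(size=(n_rel, d))  # complex relation embeddings

def complex_score(h, r, t):
    """ComplEx score: real part of the trilinear product <e_h, w_r, conj(e_t)>."""
    return np.real(np.sum(E[h] * W[r] * np.conj(E[t])))

# Complete (h=3, r=1, ?) by scoring every candidate tail and taking the best.
scores = np.array([complex_score(3, 1, t) for t in range(n_ent)])
best_tail = int(np.argmax(scores))
```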
“…• CKE adopts a heterogeneous network embedding method [18], termed TransR [19], to extract items' structural representations by considering the heterogeneity of both nodes and relationships. • CFKG proposes a knowledge-base representation learning framework that embeds heterogeneous entities for recommendation [20].…”
Section: Comparison Methods
Mentioning confidence: 99%
“…TransD [10] projects the head entity and tail entity with separate projection matrices that are constructed dynamically from projection vectors, which greatly reduces the number of parameters and calculations. TransR [11] assumes that entities under different relations lie in different semantic spaces, so entities and relations are embedded in two distinct kinds of spaces, an entity space and multiple relation-specific spaces, and entities are projected into the relation-specific space through the relation projection matrix. TransM [12] and TransF [13] optimize the model by changing the score function of TransE, improving its ability to represent complex relations.…”
Section: Related Work
Mentioning confidence: 99%
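To make the contrast in the quoted passage concrete, the sketch below shows TransD's dynamic projection, M_rh = r_p h_p^T + I, which replaces TransR's full per-relation matrix with two small projection vectors. The vector values and dimensions are illustrative only.

```python
# Sketch of the TransD idea: projection matrices built from projection vectors
# instead of a full d_r x d_e matrix per relation. Toy values, not trained ones.
import numpy as np

rng = np.random.default_rng(0)
d_e, d_r = 50, 40

h, h_p = rng.normal(size=d_e), rng.normal(size=d_e)   # head vector + its projection vector
t, t_p = rng.normal(size=d_e), rng.normal(size=d_e)   # tail vector + its projection vector
r, r_p = rng.normal(size=d_r), rng.normal(size=d_r)   # relation vector + its projection vector

I = np.eye(d_r, d_e)                                   # padded identity matrix
M_rh = np.outer(r_p, h_p) + I                          # dynamic projection for the head
M_rt = np.outer(r_p, t_p) + I                          # dynamic projection for the tail

# Translation-based score (lower = more plausible), as in the TransE family.
score = np.linalg.norm(M_rh @ h + r - M_rt @ t)
```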