2021
DOI: 10.1145/3424672

Knowledge Graph Embedding for Link Prediction

Abstract: Knowledge Graphs (KGs) have found many applications in industrial and academic settings, which in turn have motivated considerable research efforts towards large-scale information extraction from a variety of sources. Despite such efforts, it is well known that even the largest KGs suffer from incompleteness; Link Prediction (LP) techniques address this issue by identifying missing facts among entities already in the KG. Among the recent LP techniques, those based on KG embeddings h…
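
To make the link prediction setting concrete: an embedding-based LP model represents each entity and relation as a vector and uses a scoring function to rank candidate facts. The sketch below is a minimal, assumed illustration using a TransE-style score (h + r ≈ t); the entity names, dimensions, and random vectors are hypothetical, not taken from the surveyed systems.

```python
# Minimal illustrative sketch (assumed): a TransE-style plausibility score for a candidate fact.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy, randomly initialized embeddings for a few entities and one relation.
entities = {e: rng.normal(size=dim) for e in ["UK", "London", "Paris"]}
relations = {"hasCapital": rng.normal(size=dim)}

def transe_score(head, relation, tail):
    """Higher score = more plausible under the TransE assumption h + r ≈ t."""
    return -np.linalg.norm(entities[head] + relations[relation] - entities[tail])

# Rank candidate tails for the incomplete fact (UK, hasCapital, ?).
candidates = ["London", "Paris"]
print(sorted(candidates, key=lambda t: transe_score("UK", "hasCapital", t), reverse=True))
```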

Cited by 380 publications (234 citation statements)
References 56 publications
“…Semantic embedding has also been extended to KGs composed of role assertions (Wang et al, 2017). The entities and relations (object properties) are represented in a vector space while retaining their relative relationships (semantics), and the resulting vectors are then applied to downstream tasks including link prediction (Rossi et al, 2020), entity alignment (Sun et al, 2020), and erroneous fact detection and correction (Chen et al, 2020a). One paradigm for learning KG representations is computing the embeddings in an end-to-end manner, iteratively adjusting the vectors using an optimization algorithm to minimize the overall loss across all the triples, where the loss is usually calculated by scoring the truth/falsity of each triple (positive and negative samples).…”
Section: Semantic Embedding
confidence: 99%
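
The end-to-end paradigm described in this excerpt (iteratively adjusting embeddings to minimize a loss over positive and negative triples) can be sketched as follows. This is an assumed, minimal TransE-style training loop with random tail corruption and a margin ranking loss; the toy triples, hyperparameters, and manual gradient updates are illustrative only.

```python
# Assumed minimal sketch of end-to-end KG embedding training with negative sampling.
import numpy as np

rng = np.random.default_rng(42)
triples = [(0, 0, 1), (1, 1, 2)]          # (head, relation, tail) as integer ids
n_entities, n_relations, dim = 3, 2, 16
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings
margin, lr = 1.0, 0.01

def score(h, r, t):
    return np.linalg.norm(E[h] + R[r] - E[t])  # lower distance = more plausible

for epoch in range(100):
    for h, r, t in triples:
        t_neg = rng.integers(n_entities)   # corrupt the tail to get a negative sample
        if t_neg == t:
            continue
        # Margin ranking loss: the positive triple should score at least `margin` better.
        loss = margin + score(h, r, t) - score(h, r, t_neg)
        if loss > 0:
            # Gradients of the two L2 distances w.r.t. the involved embeddings.
            g_pos = (E[h] + R[r] - E[t]) / (score(h, r, t) + 1e-9)
            g_neg = (E[h] + R[r] - E[t_neg]) / (score(h, r, t_neg) + 1e-9)
            E[h]     -= lr * (g_pos - g_neg)
            R[r]     -= lr * (g_pos - g_neg)
            E[t]     += lr * g_pos
            E[t_neg] -= lr * g_neg
```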
“…Various kinds of KG embedding algorithms have been proposed and successfully applied to KG refinement (e.g., link prediction (Rossi et al, 2020) and entity alignment (Sun et al, 2020)), recommendation systems (Ristoski et al, 2019), zero-shot learning (Chen et al, 2020c; Wang et al, 2018), interaction prediction in bioinformatics (Myklebust et al, 2019), and so on. However, most of these algorithms focus on creating embeddings for multi-relational graphs composed of RDF (Resource Description Framework) triples such as ⟨England, isPartOf, UK⟩ and ⟨UK, hasCapital, London⟩.…”
Section: Introduction
confidence: 99%
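
As a small illustration of the RDF-style triple format this excerpt refers to, embedding methods typically first map entity and relation labels to integer ids before any vectors are learned. The toy graph below is assumed purely for illustration.

```python
# Sketch of indexing RDF-style triples such as ⟨England, isPartOf, UK⟩ into integer ids.
triples = [
    ("England", "isPartOf", "UK"),
    ("UK", "hasCapital", "London"),
]

entity2id, relation2id = {}, {}
for h, r, t in triples:
    for e in (h, t):
        entity2id.setdefault(e, len(entity2id))
    relation2id.setdefault(r, len(relation2id))

indexed = [(entity2id[h], relation2id[r], entity2id[t]) for h, r, t in triples]
print(entity2id)   # {'England': 0, 'UK': 1, 'London': 2}
print(indexed)     # [(0, 0, 1), (1, 1, 2)]
```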
“…Message queues are often used as message middleware in distributed systems; they reduce coupling between modules, enable asynchronous communication between modules, and smooth system traffic peaks. They are widely used in e-commerce systems, logging systems, and subscription-publishing systems, and are also well suited to certain distributed computing tasks [13]. We analyse the knowledge graph construction process, identify, in conjunction with actual projects, the steps and algorithms that are inefficient for single-node processing, and select applicable parallelization techniques to parallelize them, improving efficiency without degrading performance.…”
Section: Status of Research
confidence: 99%
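
To illustrate the decoupling and asynchronous communication that this excerpt attributes to message queues, here is a minimal producer/consumer sketch. Python's in-process queue.Queue stands in for a real broker (such as Kafka or RabbitMQ); the triple payloads and stage names are assumed, not taken from the cited system.

```python
# Assumed sketch: a bounded queue decouples an extraction stage from a loading stage.
import queue
import threading

q = queue.Queue(maxsize=100)   # bounded queue also smooths traffic peaks

def producer():
    # e.g., an extraction stage emitting candidate triples during KG construction
    for i in range(5):
        q.put(("England", "isPartOf", "UK", i))
    q.put(None)                # sentinel: no more messages

def consumer():
    # e.g., a downstream stage loading triples into the graph store
    while True:
        msg = q.get()
        if msg is None:
            break
        print("processed", msg)

threading.Thread(target=producer).start()
t = threading.Thread(target=consumer)
t.start()
t.join()
```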
“…Moreover, prediction of missing relations is referred to as relation prediction and prediction of missing entities is called entity prediction. These tasks can be achieved by several KG embedding methods [8]. Such embedding methods first learn the vector representations for entities and relations.…”
Section: Link Prediction Task
confidence: 99%
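
Entity prediction as described here (given a head and a relation, predict the missing tail) amounts to scoring every entity as a candidate and ranking them. The sketch below assumes already-learned, here randomly generated, embeddings and a TransE-style distance; it is illustrative only.

```python
# Assumed sketch of entity prediction: rank all candidate tails for a query (head, relation, ?).
import numpy as np

rng = np.random.default_rng(7)
n_entities, dim = 5, 16
E = rng.normal(size=(n_entities, dim))   # entity embeddings (stand-ins for learned vectors)
R = rng.normal(size=(3, dim))            # relation embeddings

def rank_tails(h, r):
    # TransE-style distance to every candidate tail; smaller distance ranks higher.
    distances = np.linalg.norm(E[h] + R[r] - E, axis=1)
    return np.argsort(distances)

print(rank_tails(h=0, r=1))  # entity ids ordered from most to least plausible tail
```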
“…Although knowledge graphs are able to model large amounts of data and facts, they still suffer from incompleteness [8] due to missing facts in the literature. Discovering missing facts using existing relationships in the literature is called link prediction.…”
Section: Introduction
confidence: 99%