2019
DOI: 10.1007/978-3-030-23887-2_15
An Ontology-Based Deep Learning Approach for Knowledge Graph Completion with Fresh Entities

Abstract: This paper introduces a new initialization method for knowledge graph (KG) embedding that can leverage ontological information in knowledge graph completion problems, such as link classification and link prediction. Although the initialization method is general and applicable to different KG embedding approaches in the literature, such as TransE or RESCAL, this paper experiments with deep learning and specifically with the neural tensor network (NTN) model. The experimental results show that the proposed metho…
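The abstract describes the approach only at a high level, so the following is a minimal sketch of one plausible reading, assuming that a fresh (previously unseen) entity's embedding is initialized from the centroid of embeddings of known entities sharing its ontology class, and then scored with a TransE-style function. All names (`class_of`, `init_fresh_entity`, the toy data) are hypothetical illustrations, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Hypothetical toy data: trained embeddings for known entities and
# an ontology assigning each entity to a class (e.g. via rdf:type).
entity_emb = {e: rng.normal(size=dim) for e in ["Paris", "Berlin", "Einstein"]}
class_of = {"Paris": "City", "Berlin": "City", "Einstein": "Person"}

def init_fresh_entity(fresh_class, entity_emb, class_of, dim, rng):
    """Initialize a fresh entity from the centroid of the embeddings of
    known entities in the same ontology class; fall back to a random
    vector if the class has no known members."""
    members = [entity_emb[e] for e, c in class_of.items() if c == fresh_class]
    if not members:
        return rng.normal(size=dim)
    return np.mean(members, axis=0)

# A fresh entity "Madrid" typed as City starts near other cities
# instead of at a purely random point in embedding space.
madrid = init_fresh_entity("City", entity_emb, class_of, dim, rng)

# TransE-style plausibility score for a candidate triple (h, r, t):
# a smaller ||h + r - t|| means a more plausible link.
relation_emb = {"capital_of": rng.normal(size=dim)}
spain = rng.normal(size=dim)  # placeholder embedding for "Spain"
score = -np.linalg.norm(madrid + relation_emb["capital_of"] - spain)
print(f"TransE-style score for (Madrid, capital_of, Spain): {score:.3f}")
```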

Cited by 4 publications (7 citation statements)
References 5 publications
“…In fact, an ontology describes the basic framework of how people conceptualize a domain, whereas a knowledge graph is rich in entity and relation instances. So far, ontologies have been widely used in knowledge graph research [31][32][33][34]. As noted by Paulheim [35], ontologies are mainly involved in the construction and data introduction phases of KGs.…”
Section: Classical Research Limitations
confidence: 99%
“…Under the matrix factorization framework, the text features of nodes are introduced into network representation learning. The Context-Aware Network Embedding (CANE) model [118] uses the text information of network nodes to interpret the relationships between nodes and learns context-aware representations that vary with a node's neighbors. The work in [119] uses ontology information to improve link classification and link prediction.…”
Section: Application of Semantic Information in the Knowledge Graph
confidence: 99%
“…It has been observed to be advantageous to transform categorical variables with suitable feature engineering before applying a neural network [35]. For this, we used one-hot encoding, a robust feature engineering scheme, to generate the appropriate feature vector indices [16]. These categorical features are mapped to sensor state vector indices representing the concurrent sensor activation patterns for a particular activity.…”
Section: One-hot Code Vectorization
confidence: 99%
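As a concrete illustration of the one-hot scheme this citing work describes, the sketch below maps categorical sensor identifiers to one-hot indices and builds a binary sensor-state vector for an activity. The sensor names and the helper `one_hot_activity_vector` are hypothetical and chosen only for illustration.

```python
import numpy as np

# Hypothetical categorical sensor identifiers from a smart-home dataset.
sensors = ["door_front", "motion_kitchen", "motion_bedroom", "fridge"]
sensor_index = {name: i for i, name in enumerate(sensors)}  # one-hot index per category

def one_hot_activity_vector(active_sensors, sensor_index):
    """Encode the set of concurrently active sensors as a binary vector,
    i.e. the sum of the one-hot vectors of the individual categories."""
    vec = np.zeros(len(sensor_index), dtype=np.float32)
    for name in active_sensors:
        vec[sensor_index[name]] = 1.0
    return vec

# "Preparing a meal" might activate the kitchen motion and fridge sensors.
x = one_hot_activity_vector(["motion_kitchen", "fridge"], sensor_index)
print(x)  # [0. 1. 0. 1.]
```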
“…In addition to separating logic, the SPARQL query language also provides support for extracting these semantics and assertions to interpret rule-based complex activities [15]. In work by Amador et al. [16], the authors used SPARQL to retrieve class entities and their types, which were later transformed into vector form before applying deep learning approaches. Similarly, Socher et al. [17] have bridged neural networks with an ontological knowledge base for the identification of additional facts.…”
Section: Introduction
confidence: 99%
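The passage above mentions using SPARQL to retrieve class entities and their types before vectorizing them. A minimal sketch of that retrieval step is shown below using rdflib; the toy graph and the rdf:type query are assumptions for illustration, not the exact queries used in the cited work.

```python
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")

# Hypothetical toy ontology/instance data; the cited work queries a real knowledge base.
g = Graph()
g.add((EX.Paris, RDF.type, EX.City))
g.add((EX.Einstein, RDF.type, EX.Person))
g.add((EX.City, RDFS.subClassOf, EX.Place))

# Retrieve every entity together with its asserted class.
query = """
SELECT ?entity ?class
WHERE {
    ?entity a ?class .
}
"""

for entity, cls in g.query(query):
    # Each (entity, class) pair could later be mapped to a vector
    # (e.g. one-hot or embedding indices) before feeding a neural model.
    print(entity, "->", cls)
```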