2022
DOI: 10.48550/arxiv.2202.09791
Preprint

Contextual Semantic Embeddings for Ontology Subsumption Prediction

Abstract: Automating ontology curation is a crucial task in knowledge engineering. Prediction by machine learning techniques such as semantic embedding is a promising direction, but the relevant research is still preliminary. In this paper, we present a class subsumption prediction method named BERTSubs, which uses the pre-trained language model BERT to compute contextual embeddings of the class labels and customized input templates to incorporate contexts of surrounding classes. The evaluation on two large-scale real-w…
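To make the template idea concrete, here is a minimal sketch of scoring a candidate subsumption with a BERT sequence-pair classifier, assuming the labels of the two classes are fed as a sentence pair. The checkpoint, function name, and example labels are placeholders, not the authors' code; a real model would first be fine-tuned on declared subsumption axioms.

```python
# Minimal sketch: score a candidate subsumption axiom with a BERT
# sequence-pair classifier. Checkpoint name, function name, and example
# labels are placeholders (assumptions), not BERTSubs' actual code.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # would be fine-tuned on subsumptions
)

def subsumption_score(sub_label: str, sup_label: str) -> float:
    """Probability that `sub_label` names a subclass of `sup_label`."""
    inputs = tokenizer(sub_label, sup_label, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(subsumption_score("acute bronchitis", "respiratory disease"))
```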

Cited by 6 publications (10 citation statements)
References 11 publications
“…Such vectors can then be used as additional inputs to machine learning models based on neural networks, to improve their performance by allowing them to use an approximation of expert knowledge [87]. A good method of learning embeddings from symbolic knowledge should ideally leverage the structural information present in relations between abstract concepts, and should not try to learn embeddings for abstract concepts with the aid of word embeddings, due to the ambiguity of language, its limited abstraction, and other problems [11].…”
Section: Knowledge Base Embeddings
confidence: 99%
“…In general, concept embeddings are represented by vectors in the real vector space $\mathbb{R}^{N_e}$, where $N_e$ is the embedding dimension. In Equations (5)-(11) we define the recursive function $h_{\mathcal{K}} : \mathcal{ALC} \to \mathbb{R}^{N_e}$ that maps $\mathcal{ALC}$ concepts for knowledge base $\mathcal{K}$ to embedding vectors in $\mathbb{R}^{N_e}$. We begin with the embedding for each concept name $A_i$: a vector $W_{\mathcal{K},A_i}$ of dimension $N_e$, as given by Equation (5).…”
Section: Embedding Layer
confidence: 99%
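The recursive embedding $h_{\mathcal{K}}$ quoted above can be illustrated with a short sketch. Only the base case (a lookup table giving each concept name $A_i$ its vector $W_{\mathcal{K},A_i}$) follows the quoted definition; the conjunction combinator below is an illustrative placeholder, not the cited paper's Equations (5)-(11).

```python
# Hypothetical sketch of a recursive ALC concept-embedding function h_K.
# Base case: a lookup table W with one vector per concept name A_i.
# The "and" combinator is a placeholder assumption, not the cited
# paper's actual definition.
import torch
import torch.nn as nn

N_e = 64  # embedding dimension

class ConceptEmbedding(nn.Module):
    def __init__(self, concept_names, dim=N_e):
        super().__init__()
        self.index = {name: i for i, name in enumerate(concept_names)}
        self.W = nn.Embedding(len(concept_names), dim)  # one vector per A_i
        self.conj = nn.Linear(2 * dim, dim)             # placeholder combinator

    def h(self, concept):
        if isinstance(concept, str):                    # concept name A_i
            return self.W(torch.tensor(self.index[concept]))
        op, left, right = concept                       # e.g. ("and", C, D)
        if op == "and":
            return self.conj(torch.cat([self.h(left), self.h(right)]))
        raise NotImplementedError(op)

emb = ConceptEmbedding(["Person", "Student"])
print(emb.h(("and", "Person", "Student")).shape)  # torch.Size([64])
```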
“…BERTSubs with Isolated Class (IC). BERTSubs with the IC setting [4] has the same architecture as BERTMap, but it fine-tunes the BERT model with the declared subsumptions in the two ontologies to be matched. The current results are based on the labels defined by rdfs:label.…”
Section: Subsumption Matching
confidence: 99%
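As a rough illustration of the fine-tuning data described above, the sketch below builds labeled pairs from declared subsumptions: each axiom (C SubClassOf D) yields a positive label pair, and a randomly corrupted superclass yields a negative. The helper name and negative-sampling strategy are assumptions, not BERTSubs' actual pipeline.

```python
# Assumed sketch of building IC fine-tuning samples from declared
# subsumption axioms; helper name and negative sampling are hypothetical.
import random

def make_ic_samples(subsumptions, all_labels):
    """subsumptions: (sub_label, sup_label) pairs declared in the ontologies."""
    samples = []
    for sub, sup in subsumptions:
        samples.append((sub, sup, 1))  # declared axiom -> positive
        neg = random.choice([l for l in all_labels if l != sup])
        samples.append((sub, neg, 0))  # corrupted pair -> negative
    return samples

pairs = [("acute bronchitis", "bronchitis"), ("bronchitis", "lung disease")]
labels = ["bronchitis", "lung disease", "heart disease"]
print(make_ic_samples(pairs, labels))
```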
“…However, there are often more subsumption mappings than equivalence mappings between real-world ontologies, and the former could play an important role in knowledge integration and ontology curation. With the blooming research and application of ML and text understanding techniques, systems for subsumption matching (e.g., BERTSubs [4]) will likely become more feasible and widely investigated.…”
Section: Introduction
confidence: 99%