Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/845

EL Embeddings: Geometric Construction of Models for the Description Logic EL++

Abstract: An embedding is a function that maps entities from one algebraic structure into another while preserving certain characteristics. Embeddings are used successfully for mapping relational data or text into vector spaces, where they support machine learning, similarity search, and similar tasks. We address the problem of finding vector space embeddings for theories in the Description Logic EL++ that are also models of the TBox. To find such embeddings, we define an optimization problem that character…
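
Concretely, the paper's geometric construction represents each class as an n-ball (a center vector plus a radius) and turns each TBox axiom into a loss term; a zero-loss embedding is then a geometric model of the axiom. As a minimal sketch (function and variable names are ours, not the paper's), the hinge loss for a subsumption axiom C ⊑ D can be written as:

```python
import numpy as np

def subsumption_loss(center_c, radius_c, center_d, radius_d):
    """Hinge loss for the axiom C ⊑ D when classes are n-balls.

    The loss is zero exactly when the ball for C lies entirely
    inside the ball for D, i.e. when the embedding geometrically
    models the subsumption.
    """
    return max(0.0, np.linalg.norm(center_c - center_d) + radius_c - radius_d)

# A ball of radius 1 at the origin sits inside a ball of radius 2 there:
print(subsumption_loss(np.zeros(2), 1.0, np.zeros(2), 2.0))  # → 0.0
```

Minimizing this term pulls the ball for C inside the ball for D; the published method adds analogous terms for the other EL++ constructors.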

Cited by 69 publications (89 citation statements). References 0 publications. Citing publications span 2019–2024.

“…Table 2 summarizes the key characteristics of sim compared to the other logic-based methods: the approaches of [10], [19], [11], [12], [29], [30], and [31] are Symbolic ⇒ Neuro via virtualization, whereas sim is Neuro ⇒ Symbolic, combining formal semantics (the symbolic part) with virtualization (the neuro part) over description trees…”
Section: A. On the Viewpoint of Ontology Similarity (citation type: mentioning)
confidence: 99%
“…Finally, embeddings can be trained subject to the interpretation function I so as to preserve the semantics defined for each logical construct of DLs. For instance, Hoehndorf et al. [31] formalized loss functions corresponding to the constructs provided by the logic EL++, i.e., the constructors intersection, existential quantification, and bottom.…”
Section: B. On the Viewpoint of Neuro-Symbolism (citation type: mentioning)
confidence: 99%
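
In the n-ball picture, each of those constructors gets its own loss term. As one hedged illustration (names are ours; the paper's exact formulation may include a margin parameter), a disjointness axiom C ⊓ D ⊑ ⊥ can be penalized whenever the two balls overlap:

```python
import numpy as np

def disjointness_loss(center_c, radius_c, center_d, radius_d):
    # Hinge loss for C ⊓ D ⊑ ⊥: zero iff the two n-balls do not overlap,
    # i.e. the distance between centers is at least the sum of the radii.
    return max(0.0, radius_c + radius_d - np.linalg.norm(center_c - center_d))
```

Summing such per-axiom terms over the TBox yields the optimization problem whose minimizers are (approximate) geometric models.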
“…An early example is (d'Amato et al 2009), which proposed to use a similarity metric between individuals to find plausible answers to queries. More recently, Bouraoui and Schockaert (2018) proposed a method for finding plausible missing ABox assertions, by representing each concept as a Gaussian distribution in a vector space, while Kulmanov et al (2019) proposed a method to learn a vector space embedding of EL ontologies for this purpose. The problem of completing TBoxes using vector space representations was considered in (Bouraoui and Schockaert 2019).…”
Section: Related Work (citation type: mentioning)
confidence: 99%
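
The Gaussian representation mentioned above can be sketched as scoring how plausibly an individual belongs to a concept: each concept is a Gaussian over the embedding space, and a higher density at an individual's vector suggests a more plausible missing ABox assertion. A hedged numpy illustration (names are ours):

```python
import numpy as np

def log_membership(x, mu, cov):
    # Log-density of individual x under a concept's Gaussian (mu, cov);
    # a higher score suggests Concept(x) is a more plausible assertion.
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    k = mu.shape[0]
    return -0.5 * (k * np.log(2.0 * np.pi) + logdet + d @ np.linalg.solve(cov, d))
```

Ranking candidate individuals by this score gives one way to propose plausible completions of the ABox.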
“…Embedding the classes, relations, and instances in ontologies can provide useful features for predictive models that rely on background knowledge, and these embeddings can incorporate ontology axioms as well as natural language annotations such as labels and definitions (Kulmanov et al., 2019; Liu-Wei et al., 2019; Althubaiti et al., 2019; Smaili et al., 2018a). However, using the natural language information in ontologies can also add noise, in particular when labels or descriptions use complex terms, such as chemical formulas, which are not easy to recognize in natural language text (Smaili et al., 2019).…”
Section: Ontology-Based Normalization of Natural Language (citation type: mentioning)
confidence: 99%
“…We apply these normalization methods so that we can utilize literature and ontologies jointly in machine learning models. Ontology-based machine learning methods mainly use the formal axioms and entity-class annotations without considering natural language information (Smaili et al., 2018a; Kulmanov et al., 2019; Holter et al., 2019). At the same time, a number of methods apply learning to the literature directly to perform biomedical analysis and prediction tasks (Kim et al., 2019; Naeem et al., 2010; Wong and Shatkay, 2013).…”
Section: Introduction (citation type: mentioning)
confidence: 99%