2021
DOI: 10.48550/arxiv.2103.01719
Preprint
Differentiable Inductive Logic Programming for Structured Examples

Abstract: The differentiable implementation of logic yields a seamless combination of symbolic reasoning and deep neural networks. Recent research, which has developed a differentiable framework to learn logic programs from examples, can even acquire reasonable solutions from noisy datasets. However, this framework severely limits expressions for solutions, e.g., no function symbols are allowed, and the shapes of clauses are fixed. As a result, the framework cannot deal with structured examples. Therefore we propose a n…
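The "differentiable implementation of logic" the abstract refers to typically replaces Boolean conjunction with a t-norm, so clause evaluation becomes a smooth function of soft truth values. The sketch below is only an illustration of that general idea under the product t-norm, not the paper's actual method; the predicate names and the 0.9 "noisy fact" value are made up:

```python
# Soft truth values of ground atoms in [0, 1]; the 0.9 models a noisy fact.
facts = {"parent(a,b)": 1.0, "parent(b,c)": 0.9}

def product_tnorm(values):
    # Differentiable AND: the product t-norm over soft truth values.
    out = 1.0
    for v in values:
        out *= v
    return out

# Soft evaluation of the body of a clause such as
#   grandparent(a,c) :- parent(a,b), parent(b,c).
body = product_tnorm(facts.values())

# The product t-norm is differentiable: the partial derivative of the
# body value w.r.t. one conjunct is the product of all the others,
# so gradient descent can adjust whatever produced the atom values.
grad_wrt_noisy_fact = product_tnorm(
    v for k, v in facts.items() if k != "parent(b,c)")

print(body, grad_wrt_noisy_fact)
```

Because the whole evaluation is differentiable, learning from noisy examples reduces to ordinary gradient-based optimization, which is what lets such frameworks tolerate mislabeled data.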

Cited by 2 publications (2 citation statements)
References 25 publications
“…Furthermore, ∂ILP only allows for two atoms per rule and predicates of arity of at most two. Shindo et al [671] improve on ∂ILP by introducing several new algorithms that deal with more complex programs including function symbols. For both frameworks scalability remains an issue.…”
Section: Inductive Logic Programming
confidence: 99%
“…Moving away from neural networks but attempting to harness gradient descent, we have approaches such as 𝛿-ILP [24] and derivations [65] which implement t-norms to create differentiable logic programs to find a suitable hypothesis in an ILP setting. More recently, by directly modelling rule membership of atoms as learnable weights, Neural Logic Networks [53,52] provide a competitive differentiable ILP system that leverages gradient descent.…”
Section: -22
confidence: 99%
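The "rule membership as learnable weights" idea in the last excerpt can be pictured as a softmax over candidate clauses: each candidate gets a learnable score, and the rule's soft truth value is the score-weighted blend of the candidates' evaluations, so gradient descent can select the right clause. This is a dependency-free sketch of that mechanism only; the clause strings, logits, and per-clause values are invented for illustration and are not taken from any of the cited systems:

```python
import math

# Candidate clauses competing for one rule slot, with learnable scores.
candidates = [
    "grandparent(X,Y) :- parent(X,Z), parent(Z,Y)",
    "grandparent(X,Y) :- parent(X,Y)",
]
logits = [2.0, 0.0]  # in a real system these are updated by gradient descent

# Softmax turns the scores into a differentiable membership distribution.
exps = [math.exp(l) for l in logits]
total = sum(exps)
weights = [e / total for e in exps]

# Soft truth value of each candidate clause on one training example,
# blended by the membership weights to give the slot's prediction.
clause_vals = [0.9, 0.1]
slot_val = sum(w * v for w, v in zip(weights, clause_vals))

print(weights, slot_val)
```

Training then pushes the logits so that the clause whose evaluation best matches the labels dominates the mixture, which is how a discrete program-selection problem becomes amenable to gradient descent.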