2020
DOI: 10.1007/978-3-030-46147-8_17

LYRICS: A General Interface Layer to Integrate Logic Inference and Deep Learning

Cited by 26 publications (30 citation statements)
References 13 publications
“…Silvestri et al. [57] show how adding domain knowledge in the form of symbolic constraints greatly improves the sampling frequency of a neural network trained to solve a combinatorial problem. The LYRICS system [42] proposes a generic interface layer for defining arbitrary first-order logic background knowledge, allowing a learning system to learn its weights under the constraints imposed by that prior knowledge.…”
Section: Informed Learning With Prior Knowledge
confidence: 99%
“…[62] shows how adding domain knowledge in the form of symbolic constraints greatly improves the sampling frequency of a neural network trained to solve a combinatorial problem. The LYRICS system [47] proposes a generic interface layer for defining arbitrary first-order logic background knowledge, allowing a learning system to learn its weights under the constraints imposed by that prior knowledge.…”
Section: Informed Learning With Prior Knowledge
confidence: 99%
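The idea these statements describe, declarative first-order rules acting as soft constraints that a network's weights are trained under, can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch example, not LYRICS's actual API: the predicate network, the toy data, and the choice of the Łukasiewicz implication are all assumptions made for illustration. It compiles a rule like FORALL x: Bird(x) -> Flies(x) into a differentiable penalty.

```python
import torch

# Hypothetical unary predicate Flies(x), realised as a small network that
# maps an entity's feature vector to a truth degree in [0, 1].
flies = torch.nn.Sequential(
    torch.nn.Linear(8, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 1), torch.nn.Sigmoid(),
)

def implies(a, b):
    # Lukasiewicz fuzzy implication: min(1, 1 - a + b).
    return torch.clamp(1.0 - a + b, max=1.0)

# Toy grounding: 32 constants as random feature vectors, with the truth
# degrees of Bird(x) given as (made-up) supervision.
x = torch.randn(32, 8)
bird = torch.rand(32, 1)

# Train the weights of Flies under FORALL x: Bird(x) -> Flies(x),
# interpreting the universal quantifier as a mean over the domain.
opt = torch.optim.Adam(flies.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    rule_truth = implies(bird, flies(x)).mean()
    loss = 1.0 - rule_truth  # penalise violations of the rule
    loss.backward()
    opt.step()
```

In LYRICS-style systems this translation from declared formulas to loss terms is generated automatically rather than written by hand as above.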
“…For instance, one can cast constants to vectors, function terms to vector functions of the corresponding dimensionality, and similarly predicates to tensors of the corresponding arity dimension (Rocktäschel et al. 2015; Diligenti et al. 2017). Again adopting a fuzzy-logic interpretation of the logical connectives, the learning problem can then be cast as a constrained numerical optimization problem, as in works such as Logic Tensor Networks (Serafini and d'Avila Garcez 2016) or LYRICS (Marra et al. 2019). While the distributed representation of the logical constructs is the subject of learning, in contrast with the Datalog program structure learning approaches discussed above, the weight (strength) of each rule needs to be specified a priori, a limitation that was recently addressed in Marra et al. (2020).…”
Section: Related Work
confidence: 99%
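The grounding this statement describes can also be sketched directly. The following hypothetical example is in the spirit of Logic Tensor Networks but is not their actual API; the entity count, predicate names, Goguen connectives, and the fixed rule weight w are all assumptions. Constants become learnable embedding vectors, predicates become neural functions of the corresponding arity, and the rule's weight is specified a priori rather than learned, as the text notes.

```python
import torch

# Constants grounded as learnable embedding vectors (10 entities, dim 4).
emb = torch.nn.Embedding(10, 4)

# Predicates grounded as neural functions of the corresponding arity:
# Smokes/1 reads one embedding, Friend/2 reads a concatenated pair.
smokes_net = torch.nn.Sequential(torch.nn.Linear(4, 1), torch.nn.Sigmoid())
friend_net = torch.nn.Sequential(torch.nn.Linear(8, 1), torch.nn.Sigmoid())

def Smokes(i):
    return smokes_net(emb(i))

def Friend(i, j):
    return friend_net(torch.cat([emb(i), emb(j)], dim=-1))

# Product t-norm for AND and its residuum (Goguen) for ->.
def t_and(a, b):
    return a * b

def t_implies(a, b):
    return torch.clamp(b / (a + 1e-7), max=1.0)

# Rule: FORALL x, y: Friend(x, y) AND Smokes(x) -> Smokes(y),
# with an a-priori weight w (fixed, not learned).
xs = torch.arange(10).repeat_interleave(10)
ys = torch.arange(10).repeat(10)
w = 2.0

params = (list(emb.parameters()) + list(smokes_net.parameters())
          + list(friend_net.parameters()))
opt = torch.optim.Adam(params, lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    body = t_and(Friend(xs, ys), Smokes(xs))
    rule_truth = t_implies(body, Smokes(ys)).mean()  # mean as the quantifier
    loss = w * (1.0 - rule_truth)
    loss.backward()
    opt.step()
```

In practice such a constraint term is combined with ordinary supervised losses; learning the rule weights themselves, rather than fixing w, is the extension attributed above to Marra et al. (2020).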