2020 · DOI: 10.1613/jair.1.11944

TensorLog: A Probabilistic Database Implemented Using Deep-Learning Infrastructure

Abstract: We present an implementation of a probabilistic first-order logic called TensorLog, in which classes of logical queries are compiled into differentiable functions in a neural-network infrastructure such as Tensorflow or Theano. This leads to a close integration of probabilistic logical reasoning with deep-learning infrastructure: in particular, it enables high-performance deep learning frameworks to be used for tuning the parameters of a probabilistic logic. The integration with these frameworks enables use of…
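To make the compilation idea concrete, here is a minimal sketch (not the authors' implementation) of how a single chain rule can be turned into a differentiable function. It assumes a toy knowledge base and a hypothetical rule uncle(X,Y) :- child(X,Z), brother(Z,Y): each binary predicate becomes a matrix of fact weights, the bound query argument becomes a one-hot vector, and the query is answered by matrix-vector products whose parameters TensorFlow can then tune by gradient descent. The entity names, rule, and weights are invented for illustration only.

import tensorflow as tf

# Toy domain: four entities, indexed so constants can be encoded as one-hot vectors.
entities = ["alice", "bob", "carol", "dave"]
idx = {e: i for i, e in enumerate(entities)}
n = len(entities)

def relation(facts):
    # Build an n x n matrix M with M[i, j] = weight of p(e_i, e_j); weights are trainable.
    m = [[0.0] * n for _ in range(n)]
    for (a, b), w in facts.items():
        m[idx[a]][idx[b]] = w
    return tf.Variable(m)

child   = relation({("alice", "bob"): 1.0})                        # child(alice, bob)
brother = relation({("bob", "carol"): 1.0, ("bob", "dave"): 0.5})  # brother(bob, ...)

def uncle_query(x):
    # Compiled form of the query uncle(x, Y) for the rule
    # uncle(X, Y) :- child(X, Z), brother(Z, Y): two matrix-vector products.
    u = tf.one_hot(idx[x], n)                              # one-hot vector for the bound argument
    z = tf.linalg.matvec(child, u, transpose_a=True)       # scores over Z such that child(x, Z)
    return tf.linalg.matvec(brother, z, transpose_a=True)  # scores over Y such that brother(Z, Y)

# Because the query is an ordinary TensorFlow computation, the fact weights can be
# tuned by gradient descent like any other network parameters.
with tf.GradientTape() as tape:
    scores = uncle_query("alice")
    loss = -tf.math.log(scores[idx["carol"]] + 1e-9)       # encourage uncle(alice, carol)
grads = tape.gradient(loss, [child, brother])
print(scores.numpy())

In TensorLog proper the predicate matrices are sparse and the compilation handles multiple rules and recursion; this sketch only illustrates the basic one-hot-vector-times-relation-matrix primitive that the citation excerpts below describe as representing symbols by row vectors of a parameter matrix.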

Cited by 42 publications (40 citation statements) · References 24 publications
“…These systems identify classes of logical queries that can be compiled into differentiable functions in a neural network infrastructure. In this space we have Tensor Logic Networks (TLNs) (Donadello et al, 2017) and TensorLog (Cohen et al, 2020). Symbols are represented as row vectors in a parameter matrix.…”
Section: Hybrid Neural-symbolic Approaches (mentioning)
confidence: 99%
“…Probabilistic Logics: We compare to PSL (Bach et al, 2017), a purely symbolic probabilistic logic, and TensorLog (Cohen et al, 2020), a neuro-symbolic one. In both cases, we instantiate the program using the weights learned with our base encoders.…”
Section: Baselines (mentioning)
confidence: 99%
“…Our work is related to all the works in Neuro-Symbolic reasoning (Serafini and Garcez, 2016;Cohen et al, 2020;Rocktäschel and Riedel, 2017;Kazemi and Poole, 2018;Aspis et al, 2018;Ebrahimi et al, 2018;Evans and Grefenstette, 2018) that aim at implementing a symbolic theorem prover with Neural Networks. These works provide proof that symbolic reasoning algorithms more complicated than the one used in this work can be implemented using neural nets.…”
Section: Related Work (mentioning)
confidence: 99%