2021
DOI: 10.48550/arxiv.2107.11864
Preprint

Neural Circuit Synthesis from Specification Patterns

Frederik Schmitt, Christopher Hahn, Markus N. Rabe, et al.

Abstract: We train hierarchical Transformers on the task of synthesizing hardware circuits directly from high-level logical specifications in linear-time temporal logic (LTL). The LTL synthesis problem is a well-known algorithmic challenge with a long history, and an annual competition is organized to track the improvement of algorithms and tooling over time. New machine learning approaches could open up many possibilities in this area, but they suffer from the lack of sufficient amounts of training data. In this pap…
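For readers new to the task: an LTL synthesis specification constrains a circuit's input/output behavior over infinite executions. Below is a minimal sketch of such a specification, using the standard two-client arbiter example rather than anything from the paper itself: r0, r1 are request inputs, g0, g1 are grant outputs, G reads "globally" (always), and F reads "finally" (eventually).

    G ¬(g0 ∧ g1)          mutual exclusion: never grant both clients at once
    ∧ G (r0 → F g0)       every request by client 0 is eventually granted
    ∧ G (r1 → F g1)       every request by client 1 is eventually granted

A synthesis procedure, whether a classical tool or the paper's hierarchical Transformer, must produce a circuit that satisfies the formula on every infinite input sequence, or report that the specification is unrealizable.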

Cited by 1 publication (3 citation statements)
References 26 publications
“…A simplified NeuroSAT architecture was trained for unsat-core predictions (Selsam & Bjørner, 2019). Neural networks have been applied to 2QBF (Lederman et al., 2020), logical entailment (Evans et al., 2018), SMT (Balunovic et al., 2018), and temporal logics (Hahn et al., 2020; Schmitt et al., 2021).…”
Section: Related Work (citation type: mentioning)
Confidence: 99%
“…Deep learning is on the verge of transitioning from traditional application domains, like image recognition (He et al., 2015), face recognition (Taigman et al., 2014), or translation (Wu et al., 2016), to domains that involve complex symbolic reasoning tasks. Examples include the application of deep neural network architectures to SAT (Selsam et al., 2019; Selsam & Bjørner, 2019; Ozolins et al., 2021), SMT (Balunovic et al., 2018), temporal specifications in verification (Hahn et al., 2020; Schmitt et al., 2021), symbolic mathematics (Lample & Charton, 2020), or theorem proving (Loos et al., 2017; Bansal et al., 2019; Huang et al., 2019; Urban & Jakubuv, 2020).…”
Section: Introduction (citation type: mentioning)
Confidence: 99%