Proceedings of the 39th ACM SIGPLAN Conference on Programming Language Design and Implementation 2018
DOI: 10.1145/3192366.3192410

Accelerating search-based program synthesis using learned probabilistic models

Abstract: A key challenge in program synthesis concerns how to efficiently search for the desired program in the space of possible programs. We propose a general approach to accelerate search-based program synthesis by biasing the search towards likely programs. Our approach targets a standard formulation, syntax-guided synthesis (SyGuS), by extending the grammar of possible programs with a probabilistic model dictating the likelihood of each program. We develop a weighted search algorithm to efficiently enumerate progr…
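The abstract's weighted-search idea can be sketched as best-first enumeration over a probabilistic grammar: partial programs sit in a priority queue ordered by the negative log-probability accumulated so far, so complete programs pop off in order of decreasing prior probability. The toy grammar, rule weights, and function names below are illustrative assumptions, not taken from the paper.

```python
import heapq
import itertools
import math

# Toy probabilistic grammar over prefix-notation arithmetic on one input x.
# Each alternative is (expansion, probability); weights are made up for
# illustration and sum to 1 per nonterminal.
RULES = {
    "E": [
        (("x",), 0.40),
        (("1",), 0.20),
        (("+", "E", "E"), 0.25),
        (("*", "E", "E"), 0.15),
    ],
}

def eval_prefix(form, x):
    """Evaluate a complete prefix-notation program on input x."""
    it = iter(form)
    def go():
        s = next(it)
        if s == "+":
            return go() + go()
        if s == "*":
            return go() * go()
        return x if s == "x" else int(s)
    return go()

def enumerate_programs():
    """Yield complete programs in order of decreasing prior probability.
    Cost = -log(probability), so a min-heap pops most-likely-first; costs
    only grow on expansion, which makes the order exact."""
    tie = itertools.count()  # tie-breaker so the heap never compares forms
    heap = [(0.0, next(tie), ("E",))]
    while heap:
        cost, _, form = heapq.heappop(heap)
        # Expand the leftmost unexpanded nonterminal, if any remains.
        idx = next((i for i, s in enumerate(form) if s in RULES), None)
        if idx is None:
            yield form  # complete program
            continue
        for expansion, p in RULES[form[idx]]:
            child = form[:idx] + expansion + form[idx + 1:]
            heapq.heappush(heap, (cost - math.log(p), next(tie), child))

def synthesize(examples, budget=1000):
    """Return the most likely program consistent with the input-output
    examples, checking at most `budget` complete candidates."""
    for form, _ in zip(enumerate_programs(), range(budget)):
        if all(eval_prefix(form, x) == y for x, y in examples):
            return form
    return None
```

For the examples `[(2, 5), (3, 7)]` this sketch returns a program equivalent to `2*x + 1`, since that is the first consistent program in probability order under the toy grammar.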

Cited by 70 publications (87 citation statements)
References 19 publications
“…In contrast, the Bayesian synthesis framework presented in this paper can synthesize programs with deterministic input-output behavior by adding hard constraints to the Lik semantic function, although alternative synthesis techniques may be required to make the synthesis more effective. Lee et al [2018] present a technique for speeding up program synthesis of non-probabilistic programs by using A* search to enumerate programs that satisfy a set of input-output constraints in order of decreasing prior probability. This prior distribution over programs is itself learned using a probabilistic higher-order grammar with transfer learning over a large corpus of existing synthesis problems and solutions.…”
Section: Related Work
confidence: 99%
“…The technique is used to synthesize programs in domain-specific languages for bit-vector, circuit, and string manipulation tasks. Like the Bayesian synthesis framework in this paper, Lee et al [2018] use PCFG priors for specifying domain-specific languages. However, the fundamental differences are that the synthesized programs in Lee et al [2018] are non-probabilistic and the objective is to enumerate valid programs sorted by their prior probability, whereas in this paper the synthesized programs are probabilistic, so enumeration is impossible and the objective is instead to sample programs according to their posterior probabilities.…”
Section: Related Work
confidence: 99%
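The contrast this snippet draws, enumerating programs by prior probability versus sampling them from a distribution, can be illustrated with ancestral sampling from a small PCFG. The grammar and weights below are illustrative assumptions (chosen subcritical so sampling terminates with probability 1); this sketches sampling from the prior only, not the cited paper's posterior inference.

```python
import random

# Illustrative PCFG in the shape of a SyGuS-style expression grammar.
# Expected children per node = 2 * (0.25 + 0.15) = 0.8 < 1, so the
# recursion terminates with probability 1.
RULES = {
    "E": [
        (("x",), 0.40),
        (("1",), 0.20),
        (("+", "E", "E"), 0.25),
        (("*", "E", "E"), 0.15),
    ],
}

def sample(symbol="E", rng=random):
    """Ancestral sampling: draw one rule in proportion to its probability,
    then recursively sample every symbol of the chosen expansion."""
    if symbol not in RULES:
        return (symbol,)  # terminal symbol
    expansions, weights = zip(*RULES[symbol])
    chosen = rng.choices(expansions, weights=weights, k=1)[0]
    out = ()
    for s in chosen:
        out += sample(s, rng)
    return out
```

Enumeration visits each program exactly once in probability order; sampling revisits likely programs often but extends naturally to settings where, as the snippet notes, enumeration is impossible.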
“…There have been several recent successes in applying (supervised) machine learning to programming languages research. For example, machine learning has been used to infer program invariants [Padhi et al 2016], improve program analysis [Liang et al 2011; Mangal et al 2015; Raghothaman et al 2018; Raychev et al 2015] and synthesis [Balog et al 2016; Feng et al 2018; Kalyan et al 2018; Lee et al 2018; Raychev et al 2016b; Schkufza et al 2013, 2014], build probabilistic models of code [Bielik et al 2016; Raychev et al 2014, 2016a], infer specifications [Bastani et al 2017, 2018b; Beckman and Nori 2011; Bielik et al 2017; Heule et al 2016; Kremenek et al 2006; Livshits et al 2009], test software [Clapp et al 2016; Godefroid et al 2017; Liblit et al 2005], and select lemmas for automated… Proof. First, because transitions are deterministic, we have…”
Section: Related Work
confidence: 99%
“…Neural networks are also used by Kalyan et al [2018] for the synthesis of text-processing tasks, improving over prior work by combining statistical and symbolic search approaches and by not requiring hand-crafted features. Finally, the work of Lee et al [2018] uses a probabilistic higher-order grammar [Bielik et al 2016] that learns to guide A* search to speed up synthesis in various domains, including bit-vectors, circuits, and text-processing tasks.…”
Section: Related Work
confidence: 99%
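The A*-guided search mentioned in the snippet above needs an admissible heuristic: a lower bound on the cost (negative log-probability) of completing each remaining nonterminal. One standard way to obtain such a bound under a PCFG, sketched below with an illustrative grammar rather than anything from the paper, is a fixed-point computation of the cheapest terminal derivation per nonterminal.

```python
import math

# Same illustrative grammar as before; weights are assumptions.
RULES = {
    "E": [
        (("x",), 0.40),
        (("1",), 0.20),
        (("+", "E", "E"), 0.25),
        (("*", "E", "E"), 0.15),
    ],
}

def best_completion_cost(rules, iterations=100):
    """For each nonterminal N, compute the minimum -log probability of any
    complete program derivable from N, by value iteration to a fixed point.
    Terminals contribute cost 0; unexpanded nonterminals start at infinity."""
    h = {n: math.inf for n in rules}
    for _ in range(iterations):
        for n, alternatives in rules.items():
            h[n] = min(
                -math.log(p) + sum(h.get(s, 0.0) for s in expansion)
                for expansion, p in alternatives
            )
    return h
```

An A*-style search over partial programs can then use priority = cost-so-far + the sum of `h[N]` over remaining nonterminals; because `h` is a true lower bound, programs are still popped in exact probability order, only with fewer expansions than uninformed best-first search.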
“…As a result, we guide the synthesis by restricting its search space, concretely by selecting a subset of constraints that is extended if the synthesis fails. In contrast, prior works [Balog et al 2017; Irving et al 2016; Kalyan et al 2018; Lee et al 2018; Long and Rinard 2016; Menon et al 2013] keep the search space unchanged and instead modify the search procedure used to find the solution. This is because, while modifying the search procedure of the A* or breadth-first search considered in prior works is straightforward, it is challenging to modify the search procedure of state-of-the-art SMT solvers, which already contain a number of carefully tuned heuristics and strategies that guide the search.…”
Section: Related Work
confidence: 99%