2023
DOI: 10.1145/3571234

Top-Down Synthesis for Library Learning

Abstract: This paper introduces corpus-guided top-down synthesis as a mechanism for synthesizing library functions that capture common functionality from a corpus of programs in a domain-specific language (DSL). The algorithm builds abstractions directly from initial DSL primitives, using syntactic pattern matching of intermediate abstractions to intelligently prune the search space and guide the algorithm towards abstractions that maximally capture shared structures in the corpus. We present an …
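As a rough illustration of the mechanism the abstract describes, here is a minimal Python sketch (my illustration under assumed names, not the paper's actual algorithm or tool): it anti-unifies two corpus expressions by syntactic pattern matching, keeping the structure they share and introducing abstraction holes where they differ.

```python
# A minimal sketch (illustrative, not the paper's algorithm): find a shared
# abstraction over two DSL expressions by syntactic pattern matching.
# Expressions are nested tuples, e.g. ("add", ("mul", "x", "2"), "1").
# Where the trees agree, the structure is kept; where they disagree, an
# abstraction hole (#0, #1, ...) is introduced, and identical mismatching
# pairs reuse the same hole.

def antiunify(a, b, holes):
    """Return the most specific pattern that matches both a and b."""
    if a == b:
        return a
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and len(a) == len(b) and a[0] == b[0]):
        # Same operator and arity: recurse over the arguments.
        return (a[0],) + tuple(antiunify(x, y, holes)
                               for x, y in zip(a[1:], b[1:]))
    key = (a, b)
    if key not in holes:
        holes[key] = f"#{len(holes)}"
    return holes[key]

corpus = [
    ("add", ("mul", "x", "2"), "1"),
    ("add", ("mul", "y", "3"), "1"),
]
print(antiunify(corpus[0], corpus[1], {}))
# ('add', ('mul', '#0', '#1'), '1')  -- a candidate abstraction for the corpus
```

Scoring candidate patterns like this one by how much shared corpus structure they capture is what lets a top-down search prune unpromising abstractions early, per the abstract.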


Cited by 15 publications (4 citation statements)
References 32 publications
“…These methods relax the context-free assumption used in traditional Bayesian symbolic models, and jointly infer both the posterior over concept 'programs' and a latent library that defines this posterior. This notion of concept libraries is attracting increasing attention across cognitive science and generative AI (Bowers et al 2023; Tian et al 2020; Wang et al 2023; Wong et al 2022).…”
Section: Resource-rational Library Learning (mentioning, confidence: 99%)
“…As the number of possible refactorings grows combinatorially with program size, we needed a new data structure for representing and manipulating sets of refactorings, which we designed by combining ideas from version space algebras [20][21][22] and equivalence graphs [23] (described in our companion manuscript [2]). Recent work improved upon our original refactoring algorithm by making it more expressive [24] as well as orders of magnitude faster [25].…”
Section: Wake/Sleep Program Learning (mentioning, confidence: 99%)
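The quoted passage describes combining version space algebras with equivalence graphs to represent combinatorially many refactorings in one structure. A minimal e-graph sketch (my illustration, with congruence closure omitted for brevity) shows the core trick: hash-consing expressions into equivalence classes so that a single structure stands for many equivalent programs.

```python
# A minimal e-graph sketch (assumed illustration, not the authors' data
# structure; congruence closure/rebuilding is omitted): expressions are
# hash-consed into e-classes, and `merge` records that two e-classes denote
# equal programs, so one structure compactly represents many refactorings.

class EGraph:
    def __init__(self):
        self.parent = []   # union-find over e-class ids
        self.table = {}    # canonical e-node (op, child class ids) -> e-class id

    def find(self, c):
        # Follow parents to the canonical representative, with path halving.
        while self.parent[c] != c:
            self.parent[c] = self.parent[self.parent[c]]
            c = self.parent[c]
        return c

    def add(self, op, *children):
        node = (op, tuple(self.find(c) for c in children))
        if node not in self.table:
            cid = len(self.parent)
            self.parent.append(cid)
            self.table[node] = cid
        return self.find(self.table[node])

    def merge(self, a, b):
        a, b = self.find(a), self.find(b)
        if a != b:
            self.parent[b] = a
        return a

g = EGraph()
x = g.add("x")
mul = g.add("*", x, g.add("2"))      # x * 2
shift = g.add("<<", x, g.add("1"))   # x << 1
g.merge(mul, shift)                  # record the refactoring x*2 == x<<1
print(g.find(mul) == g.find(shift))  # True: one e-class, two representations
```

Because children are stored as e-class ids rather than concrete subtrees, every member of a class is shared by every expression that uses it, which is what keeps large sets of refactorings compact.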
“…While the GNN of a G-SSNN can be trained with standard gradient-based optimization techniques, its graph structure is a hyperparameter whose values must be explored with search. To conduct this search, I repurpose the cycle of library learning and distributional program search pioneered by the DreamCoder system [Ellis et al, 2021] within an evolutionary framework, incorporating the improved STITCH library learning tool of Bowers et al [2023] and the heap search algorithm of Matricon et al [2022]. As in DreamCoder, I perform rounds of library learning and distributional program search, after which I evaluate the current crop of programs on the task.…”
Section: Evolutionary Framework (mentioning, confidence: 99%)
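The cycle this quote describes, alternating program search with library compression, can be caricatured in a few lines (a toy stand-in with an invented arithmetic DSL, not the cited DreamCoder, STITCH, or heap-search implementations):

```python
import random
from collections import Counter

# A toy stand-in (invented for illustration) for the search/compress cycle:
# programs are sequences of unary ops applied to 1, and "library learning"
# here just promotes the most frequent adjacent op pair to a new primitive.

OPS = {"inc": lambda n: n + 1, "dbl": lambda n: 2 * n}

def run(prog):
    """Apply the ops in prog, left to right, starting from 1."""
    n = 1
    for op in prog:
        n = OPS[op](n)
    return n

def search(targets, rounds=3, samples=2000, max_len=6):
    best = {}  # target value -> shortest program found so far
    for _ in range(rounds):
        # Search phase: sample programs over the current library of ops.
        for _ in range(samples):
            prog = tuple(random.choices(list(OPS), k=random.randint(1, max_len)))
            val = run(prog)
            if val in targets and (val not in best or len(prog) < len(best[val])):
                best[val] = prog
        # Compression phase: promote the most common adjacent op pair among
        # the winning programs to a new composite primitive ("library learning").
        pairs = Counter(p for prog in best.values() for p in zip(prog, prog[1:]))
        if pairs:
            (a, b), _ = pairs.most_common(1)[0]
            name = f"{a}>{b}"
            if name not in OPS:
                fa, fb = OPS[a], OPS[b]
                OPS[name] = lambda n, fa=fa, fb=fb: fb(fa(n))
    return best

print(search({5, 9, 17}))
```

Later rounds sample over the enriched library, so solutions found after compression tend to be shorter, which is the payoff the cited systems get from interleaving the two phases.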