2020
DOI: 10.1007/978-3-030-58115-2_3
Program Synthesis in a Continuous Space Using Grammars and Variational Autoencoders

Cited by 5 publications (9 citation statements). References 26 publications.
“…Today in the Deep Learning community, new approaches propose, for example, to encode the equation in the neural network structure and activation functions [14,15], to use Recurrent Neural Networks to predict a string equation [16] or use Deep Reinforcement Learning (RL) as a search engine [17]. However, those methods are often data-hungry, and few of them manage to include prior knowledge through sophisticated constraints, such as those included by context-free grammars [18]. Regarding RL approaches, SR tasks use sparse delayed rewards, as the metric is only evaluated at the end of an episode [17].…”
Section: Symbolic Regression
confidence: 99%
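The grammar-based prior knowledge mentioned in [18] can be made concrete with a small sketch: if candidate expressions are generated only through the productions of a context-free grammar, every sample is syntactically valid by construction. The grammar and depth limit below are illustrative assumptions for this sketch, not the grammar of any cited work.

```python
import random

# Illustrative context-free grammar for symbolic regression; these
# productions are an assumption for this sketch, not the grammar of [18].
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"],
               ["<func>", "(", "<expr>", ")"],
               ["<var>"],
               ["<const>"]],
    "<op>": [["+"], ["-"], ["*"]],
    "<func>": [["sin"], ["cos"], ["exp"]],
    "<var>": [["x"]],
    "<const>": [["1.0"], ["2.0"]],
}

def sample(symbol="<expr>", depth=0, max_depth=4):
    """Expand a nonterminal recursively; every result is valid by construction."""
    if symbol not in GRAMMAR:
        return symbol                       # terminal: emit verbatim
    rules = GRAMMAR[symbol]
    if depth >= max_depth:                  # force termination near the depth cap
        rules = [r for r in rules
                 if all(s not in GRAMMAR for s in r)] or rules[-1:]
    return "".join(sample(s, depth + 1, max_depth)
                   for s in random.choice(rules))

random.seed(0)
print([sample() for _ in range(3)])         # three syntactically valid expressions
```

Because the search space is restricted to derivations of the grammar, such a prior also sidesteps the ill-formed candidates that an unconstrained string predictor can emit.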
“…Program synthesis methods that have been applied to PSB1 have used varying methods for constraining the instruction set and other program syntax. For example, some have used grammars [6,7,22,27,32,39,50] while others have used data-type categorized subsets of an instruction set [16,17]. We do not want to constrain what a reasonable approach to selecting instructions may look like for any given program synthesis system.…”
Section: Using PSB2
confidence: 99%
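The contrast the excerpt draws between grammars and data-type categorized instruction subsets can be illustrated with a minimal sketch; the instruction names and groupings below are assumptions in the style of PushGP instruction sets, not the exact sets used in [16] or [17].

```python
# Hypothetical instructions grouped by the data type they operate on.
INSTRUCTIONS = {
    "integer": ["integer_add", "integer_sub", "integer_mult"],
    "boolean": ["boolean_and", "boolean_or", "boolean_not"],
    "string":  ["string_concat", "string_length", "string_reverse"],
}

def subset_for(problem_types):
    """Keep only the instructions whose data types the problem actually uses."""
    return [op for t in problem_types for op in INSTRUCTIONS.get(t, [])]

# A problem over integers and booleans gets a six-instruction subset:
print(subset_for(["integer", "boolean"]))
```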
“…In addition to the previously mentioned approaches stackbased GP, grammar-guided GP, and linear GP, we also identified a paper by Lynch et al [72] that proposes an approach that could be a relevant direction for future program synthesis research. The authors use a variational autoencoder [73] to learn the representation of programs which are sampled using a context-free grammar definition.…”
Section: Linear GP and Further Approaches
confidence: 99%
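The approach attributed to Lynch et al. [72] combines the two ideas above: programs are expressed as sequences of production-rule choices from a context-free grammar, and a variational autoencoder [73] learns a continuous latent representation of those sequences. The PyTorch sketch below shows the general shape under assumed sizes and a GRU encoder/decoder; it is not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GrammarVAE(nn.Module):
    """Sketch of a VAE over fixed-length sequences of one-hot grammar
    production-rule choices; all sizes here are assumptions."""

    def __init__(self, n_rules=12, hidden=128, latent=16):
        super().__init__()
        self.encoder = nn.GRU(n_rules, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.from_z = nn.Linear(latent, hidden)
        self.decoder = nn.GRU(n_rules, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_rules)

    def forward(self, x):                   # x: (B, seq_len, n_rules), one-hot
        _, h = self.encoder(x)              # h: (1, B, hidden)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        h0 = torch.tanh(self.from_z(z)).unsqueeze(0)
        dec, _ = self.decoder(x, h0)        # teacher forcing (unshifted, for brevity)
        return self.out(dec), mu, logvar    # per-step logits over rules

def vae_loss(logits, x, mu, logvar):
    """Standard ELBO terms: rule-choice reconstruction + KL against N(0, I)."""
    recon = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                            x.argmax(-1).reshape(-1))
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

model = GrammarVAE()
x = F.one_hot(torch.randint(0, 12, (4, 20)), 12).float()  # dummy rule sequences
logits, mu, logvar = model(x)
print(vae_loss(logits, x, mu, logvar).item())
```

Synthesis then amounts to searching the continuous latent space, for example by perturbing or interpolating z, and decoding back to rule sequences, which the grammar turns into syntactically valid programs.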
“…Last Index of Zero: stack-based GP [16], [29], [74], [48], [39], [75], [50], [51], [76], [52], [40], [53], [42], [41], [80], [45], [38]; grammar-guided GP [11], [61], [12], [2], [63], [72]; linear GP [72]; 24 uses in total…”
Section: Benchmark Problem / Stack-based GP / Grammar-guided GP / Linear GP / ...
confidence: 99%