2016
DOI: 10.48550/arxiv.1611.04766
Preprint
Differentiable Genetic Programming

Year Published: 2019, 2024
Cited by 2 publications (4 citation statements)
References 0 publications
“…While probabilistic programming facilitates generic computations involving probability distributions, differentiable programming (Izzo et al., 2016; Innes et al., 2019) extends computation by enabling differentiation of arbitrary computer programs (Figure 1G). This empowers the fine-tuning of program behavior using gradient-based optimization techniques.…”
Section: Nine Simulation Intelligence Motifs For Plant Science
confidence: 99%
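
As an illustration of the differentiable-programming idea this excerpt describes, the following is a minimal sketch using JAX rather than the dCGP approach of the cited preprint; the small program, its parameter theta, and the learning rate are illustrative assumptions, not anything from the cited works.

import jax
import jax.numpy as jnp

# An arbitrary small "program": a composition of differentiable primitives
# whose behavior depends on a free parameter theta.
def program(theta, x):
    y = jnp.sin(theta * x) + theta ** 2 * x
    return jnp.sum(y ** 2)

grad_program = jax.grad(program)   # derivative of the whole program w.r.t. theta

x = jnp.linspace(0.0, 1.0, 16)
theta = 0.5
for _ in range(100):               # fine-tune program behavior by gradient descent
    theta = theta - 0.1 * grad_program(theta, x)
print(theta)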
“…We can now use symbolic regression [30,31] to automatically build a mathematical expression q_sr(x, y) that approximately reproduces the machine-learned optimal RC q_ANN^opt(x|w). Symbolic regression finds the best fit to a given data set searching both model and parameter space by genetic programming, evolving combinations of elementary functions and input variables through random mutations and survival of the fittest [31]. Applied to our model system with input variables (x, y), symbolic regression produces a simplified RC, q_sr(x, y) = (3.32x + 1.24y) exp(2.92xy), which corresponds to a committor in excellent agreement with the exact one (Figure 3).…”
Section: Algorithm
confidence: 99%
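
To illustrate the genetic-programming loop this excerpt describes, here is a minimal symbolic-regression sketch in plain Python; the primitive set, mutation scheme, and stand-in data set are illustrative assumptions and do not reproduce the committor data or the reported expression.

import math, random

# Primitive operators and terminals for expression trees over (x, y).
OPS = {'+': lambda a, b: a + b,
       '*': lambda a, b: a * b,
       'exp': lambda a: math.exp(max(min(a, 20.0), -20.0))}  # clipped to avoid overflow
TERMS = ['x', 'y', 1.0]

def random_tree(depth=3):
    # Build a random expression tree of bounded depth.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    op = random.choice(list(OPS))
    if op == 'exp':
        return (op, random_tree(depth - 1))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x, y):
    if tree == 'x':
        return x
    if tree == 'y':
        return y
    if isinstance(tree, float):
        return tree
    op, *args = tree
    return OPS[op](*(evaluate(a, x, y) for a in args))

def mutate(tree):
    # Random mutation: replace a random subtree with a fresh random tree.
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(2)
    op, *args = tree
    i = random.randrange(len(args))
    args[i] = mutate(args[i])
    return (op, *args)

def fitness(tree, data):
    # Sum of squared errors against the target data (lower is better).
    try:
        return sum((evaluate(tree, x, y) - t) ** 2 for x, y, t in data)
    except (OverflowError, ValueError):
        return float('inf')

# Stand-in data set: samples of a known function of (x, y), not the cited committor data.
data = [(x / 10, y / 10, (x / 10) * (y / 10) + x / 10)
        for x in range(10) for y in range(10)]

population = [random_tree() for _ in range(50)]
for _ in range(200):
    population.sort(key=lambda t: fitness(t, data))
    survivors = population[:10]                       # survival of the fittest
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

print(min(population, key=lambda t: fitness(t, data)))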