2020
DOI: 10.1126/sciadv.aay2631

AI Feynman: A physics-inspired method for symbolic regression

Abstract: A core challenge for both physics and artificial intelligence (AI) is symbolic regression: finding a symbolic expression that matches data from an unknown function. Although this problem is likely to be NP-hard in principle, functions of practical interest often exhibit symmetries, separability, compositionality and other simplifying properties. In this spirit, we develop a recursive multidimensional symbolic regression algorithm that combines neural network fitting with a suite of physics-inspired techniques.…
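The abstract names separability as one of the simplifying properties the method exploits. As a minimal sketch (not the paper's actual procedure), the snippet below shows how additive separability, f(x, y) ≈ g(x) + h(y), can be tested numerically once a smooth surrogate for the unknown function is available; in AI Feynman that surrogate is a fitted neural network, whereas here a plain Python callable stands in for it, and the sampling ranges and tolerance are illustrative assumptions.

```python
import numpy as np

def is_additively_separable(f, x_range, y_range, n=200, tol=1e-3):
    """Heuristic test for f(x, y) ~= g(x) + h(y).

    Uses the mixed second difference
        f(x1, y1) - f(x1, y2) - f(x2, y1) + f(x2, y2),
    which vanishes identically for additively separable functions.
    `f` stands in for a smooth surrogate (e.g., a trained neural network).
    """
    rng = np.random.default_rng(0)
    x1, x2 = rng.uniform(*x_range, size=(2, n))
    y1, y2 = rng.uniform(*y_range, size=(2, n))
    mixed = f(x1, y1) - f(x1, y2) - f(x2, y1) + f(x2, y2)
    scale = np.abs(f(x1, y1)).mean() + 1e-12  # normalize by the function's typical magnitude
    return np.abs(mixed).mean() / scale < tol

# x**2 + sin(y) is additively separable; x * y is not.
print(is_additively_separable(lambda x, y: x**2 + np.sin(y), (0.0, 1.0), (0.0, 1.0)))  # True
print(is_additively_separable(lambda x, y: x * y, (0.0, 1.0), (0.0, 1.0)))             # False
```

When such a test succeeds, the regression problem splits into two lower-dimensional subproblems, which is the kind of recursive reduction the abstract alludes to.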

Cited by 608 publications (387 citation statements)
References 30 publications
“…In AI, there are many attempts to build symbolic regression algorithms, which are automated tools to find the mathematical equation that fits the experimental data [33]. Udrescu and Tegmark [34] developed an algorithm that combines neural network fitting with a set of physics-inspired techniques. They applied it to 100 equations from the Feynman lectures on physics.…”
Section: Proposed Framework
confidence: 99%
“…It has been shown that a good design of the search space is essential in discrete structure optimization problems, e.g., neural architecture search [10][11][12], molecule optimization [13], composite design [14] and symbolic regression [15,16]. Since the QAOA is a well-recognized ansatz for combinatorial problems, we have designed the search space for G A based on gradual modifications of the QAOA ansatz.…”
Section: A. Ansatz Architecture Search
confidence: 99%
“…However, the functional form of H is usually unknown a priori. Although there are some methods proposed in the literature to identify functional forms from the data, inferring functional forms usually requires a large amount of data and can be computationally expensive [49][50][51][52][53][54]. Instead, we assume H to be an ANN model.…”
Section: Development of Artificial Neural Network Models
confidence: 99%