2019
DOI: 10.1007/978-3-030-30484-3_31

A Novel Neural Network-Based Symbolic Regression Method: Neuro-Encoded Expression Programming


Cited by 10 publications (4 citation statements)
References 34 publications
“…Performance on Real-world Problems. We validate the performance of OPT-GAN on two real-world problems. One is the optimization of the neural network-based symbolic regressor NEEP (Anjum et al. 2019), a high-dimensional optimization problem with a very complex landscape. The other is the Frequency Modulated Sound Parameter Identification (FMSPI) problem (Herrera and Lozano 2000), which plays an essential role in modern music, e.g., the emulation of acoustic musical instruments.…”
Section: Comparison With Traditional Optimizers
confidence: 99%
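For context, the FMSPI task cited above reduces to fitting the six parameters of a nested frequency-modulated signal by minimising the squared error against a target sound. The sketch below follows the commonly used benchmark formulation (sampling angle 2π/100, 101 samples, sum-of-squared-errors objective); the exact target parameter vector is an assumption here and is not taken from Herrera and Lozano (2000).

```python
# Minimal sketch of the FMSPI objective (assumed benchmark formulation, not the
# original paper's code): six parameters (a1, w1, a2, w2, a3, w3) of a nested
# frequency-modulated signal are fitted by minimising the squared error against
# a target sound.
import numpy as np

THETA = 2.0 * np.pi / 100.0
T = np.arange(101)                      # sample indices t = 0..100

def fm_signal(p, t=T):
    """Nested FM sound: a1*sin(w1*t*THETA + a2*sin(w2*t*THETA + a3*sin(w3*t*THETA)))."""
    a1, w1, a2, w2, a3, w3 = p
    return a1 * np.sin(w1 * t * THETA + a2 * np.sin(w2 * t * THETA + a3 * np.sin(w3 * t * THETA)))

TARGET_PARAMS = np.array([1.0, 5.0, -1.5, 4.8, 2.0, 4.9])   # assumed target instance
TARGET_SOUND = fm_signal(TARGET_PARAMS)

def fmspi_objective(p):
    """Sum of squared errors between candidate and target sounds (to be minimised)."""
    return float(np.sum((fm_signal(p) - TARGET_SOUND) ** 2))

print(fmspi_objective(TARGET_PARAMS))   # 0.0 at the optimum
print(fmspi_objective(np.zeros(6)))     # a poor candidate scores much higher
```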
“…Anjum, Sun, Wang, and Orchard [28] propose novel programming constructs to enable the integration of EMIA, an advanced rule-based emotion model, with the 2APL agent language. By reconsidering 2APL's syntax, semantics, and decision-making process, the two are successfully integrated.…”
Section: Agent Programming Languages Extensions
confidence: 99%
“…These methods aim to exploit the computational capacity of neural networks (Hornik et al., 1989) while providing an interpretable solution. For instance, they encode the expression in the neural network structure and activation functions (Sahoo et al., 2018; Kim et al., 2020), predict a string expression with Recurrent Neural Networks (Anjum et al., 2019), or use Deep Reinforcement Learning (Deep RL) as a search engine (Petersen et al., 2021). By combining partial derivatives and neural networks, the AI Feynman method (Udrescu and Tegmark, 2020) makes use of simplifying properties (such as units, symmetry, separability, …).…”
Section: Symbolic Regression
confidence: 99%
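The string-prediction route attributed above to Anjum et al. (2019) can be pictured with a minimal sketch: a recurrent decoder emits a prefix-notation token sequence that is then parsed and evaluated as a symbolic expression. The token set, hidden size, and hand-rolled RNN cell below are illustrative assumptions rather than the authors' implementation; in practice the decoder's weights would be optimised against a regression loss by an external search procedure.

```python
# Minimal sketch (assumptions, not the authors' NEEP code): a simple RNN decoder
# samples a prefix-notation token string, which is parsed into an expression.
import numpy as np

TOKENS = ["+", "*", "sin", "x", "1.0"]           # operators, variable, constant (assumed)
ARITY  = {"+": 2, "*": 2, "sin": 1, "x": 0, "1.0": 0}

rng = np.random.default_rng(0)
H = 16                                           # hidden size (assumed)
Wxh = rng.normal(0, 0.1, (len(TOKENS), H))       # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))                 # hidden-to-hidden weights
Who = rng.normal(0, 0.1, (H, len(TOKENS)))       # hidden-to-output weights

def sample_expression(max_len=15):
    """Decode one prefix-notation token sequence from the RNN."""
    h, prev, need, seq = np.zeros(H), np.zeros(len(TOKENS)), 1, []
    while need > 0 and len(seq) < max_len:
        h = np.tanh(prev @ Wxh + h @ Whh)        # plain RNN cell
        logits = h @ Who
        if max_len - len(seq) <= need + 1:       # force terminals so the tree closes in time
            logits[:3] = -np.inf                 # first three tokens are the operators
        p = np.exp(logits - logits.max()); p /= p.sum()
        k = rng.choice(len(TOKENS), p=p)
        seq.append(TOKENS[k])
        need += ARITY[TOKENS[k]] - 1             # open argument slots still to fill
        prev = np.eye(len(TOKENS))[k]            # one-hot feedback of the chosen token
    return seq

def evaluate(seq, x):
    """Recursively evaluate a prefix-notation token list at input x."""
    tok = seq.pop(0)
    if tok == "+":   return evaluate(seq, x) + evaluate(seq, x)
    if tok == "*":   return evaluate(seq, x) * evaluate(seq, x)
    if tok == "sin": return np.sin(evaluate(seq, x))
    return x if tok == "x" else float(tok)

expr = sample_expression()
print(" ".join(expr), "->", evaluate(list(expr), x=0.5))
```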