2019
DOI: 10.13164/mendel.2019.1.079
Hybrid Symbolic Regression with the Bison Seeker Algorithm

Abstract: This paper focuses on the use of the Bison Seeker Algorithm (BSA) in a hybrid genetic programming approach to symbolic regression, a supervised machine learning method. While the basic version of symbolic regression optimizes both the model structure and its parameters, the hybrid version uses genetic programming to find the model structure; local learning is then used to tune the model parameters. This tuning of parameters represents the lifetime adaptation of individuals. This paper aims to …
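The abstract's split between structure search and parameter tuning can be sketched as a minimal, hypothetical Python example (not the paper's code): the GP-proposed structure is fixed here to f(x) = a*x^2 + b, a made-up candidate, and only its constants (a, b) are adapted by plain gradient descent, standing in for the local-learning step.

```python
# Hedged sketch (not the paper's implementation): in hybrid symbolic
# regression, genetic programming proposes a model structure and a local
# learner then tunes its numeric parameters -- the "lifetime adaptation".
# Here the structure is fixed to f(x) = a * x**2 + b (a made-up candidate)
# and (a, b) are tuned by gradient descent on the mean squared error.

def model(x, a, b):
    return a * x * x + b

# Synthetic target data generated from a = 2.0, b = -1.0.
xs = [i / 10.0 for i in range(-10, 11)]
ys = [2.0 * x * x - 1.0 for x in xs]

def mse(a, b):
    return sum((model(x, a, b) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def tune(a, b, lr=0.3, steps=2000):
    """Lifetime adaptation: gradient descent on the MSE w.r.t. (a, b)."""
    n = len(xs)
    for _ in range(steps):
        ga = sum(2 * (model(x, a, b) - y) * x * x for x, y in zip(xs, ys)) / n
        gb = sum(2 * (model(x, a, b) - y) for x, y in zip(xs, ys)) / n
        a -= lr * ga
        b -= lr * gb
    return a, b

a, b = tune(0.0, 0.0)  # parameters recovered close to (2.0, -1.0)
```

In a full hybrid run, the GP loop would generate many such structures and each would receive this local tuning before its fitness is evaluated.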

Cited by 2 publications (1 citation statement)
References 15 publications
“…Raidl and Gunther in [11] introduced HGP (hybrid genetic programming), added weights to the top-level tree members and optimized them using a robust least squares method. For example, gradient descent [12,13], simulated annealing combined with the simplex method [14], particle swarm optimization (PSO) [15], multiple regression in the STROGANOFF method [16,17], evolutionary strategies [13,18,19], genetic algorithms [13], self-organizing migrating algorithm (SOMA) [13,20], the Bison Seeker algorithm [21], and non-linear optimization using the Levenberg-Marquardt algorithm [22,23] can be used to optimize the constants. There are many modern approaches for GP optimization.…”
Section: Optimization of Genetic Programming and Symbolic Regression (mentioning, confidence: 99%)
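The excerpt lists local methods for optimizing the numeric constants of a GP-derived expression, including the Levenberg-Marquardt algorithm. A small, hedged sketch of that idea, using a damped Gauss-Newton loop in the Levenberg-Marquardt style on the made-up candidate expression y = c0*exp(c1*x) (the expression and starting values are illustrative, not from the paper):

```python
import math

# Hedged sketch: optimizing the constants of a fixed GP expression with a
# Levenberg-Marquardt-style damped Gauss-Newton loop. The expression
# y = c0 * exp(c1 * x) is a made-up candidate, not from the paper.

xs = [i / 49 for i in range(50)]
ys = [1.5 * math.exp(0.8 * x) for x in xs]  # synthetic target: c0=1.5, c1=0.8

def cost(c0, c1):
    return sum((c0 * math.exp(c1 * x) - y) ** 2 for x, y in zip(xs, ys))

def lm_fit(c0, c1, lam=1e-2, iters=200):
    """Each step solves (J^T J + lam*I) d = -J^T r for the 2x2 system."""
    for _ in range(iters):
        r = [c0 * math.exp(c1 * x) - y for x, y in zip(xs, ys)]
        j0 = [math.exp(c1 * x) for x in xs]           # dr/dc0
        j1 = [c0 * x * math.exp(c1 * x) for x in xs]  # dr/dc1
        a00 = sum(v * v for v in j0) + lam
        a11 = sum(v * v for v in j1) + lam
        a01 = sum(u * v for u, v in zip(j0, j1))
        b0 = -sum(u * v for u, v in zip(j0, r))
        b1 = -sum(u * v for u, v in zip(j1, r))
        det = a00 * a11 - a01 * a01
        d0, d1 = (b0 * a11 - b1 * a01) / det, (a00 * b1 - a01 * b0) / det
        if cost(c0 + d0, c1 + d1) < cost(c0, c1):
            c0, c1, lam = c0 + d0, c1 + d1, lam * 0.5  # accept: trust more
        else:
            lam *= 2.0                                  # reject: damp more
    return c0, c1

c0, c1 = lm_fit(1.0, 0.0)  # converges near (1.5, 0.8)
```

In practice one would use a library solver (e.g. SciPy's `least_squares` with `method="lm"`) rather than a hand-rolled loop; the point is only how a local optimizer refines constants after GP fixes the expression's shape.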