2021
DOI: 10.1007/s10710-021-09410-y

Constant optimization and feature standardization in multiobjective genetic programming

Abstract: This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find a much less clear trend for tree sizes. In addition, we consider the effects of constant tuning…
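As a rough illustration of what numerical constant tuning means in this setting (a sketch, not the authors' implementation), the Python snippet below refines the numeric leaves of a fixed, hypothetical GP expression against training data by least squares. The tree shape, the starting constants, and the use of SciPy's Nelder-Mead optimizer are assumptions for illustration only.

    import numpy as np
    from scipy.optimize import minimize

    def tree(c, X):
        # Hypothetical evolved expression with two tunable constants:
        # f(x) = c0 * x0 + sin(c1 * x1)
        return c[0] * X[:, 0] + np.sin(c[1] * X[:, 1])

    def mse(c, X, y):
        # Training error used as the tuning objective
        return np.mean((tree(c, X) - y) ** 2)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = 2.0 * X[:, 0] + np.sin(0.5 * X[:, 1])

    # Refine the constants embedded in the tree, starting from their evolved values.
    result = minimize(mse, x0=np.array([1.0, 1.0]), args=(X, y), method="Nelder-Mead")
    print(result.x)  # tuned constants, approximately [2.0, 0.5]

In a multiobjective GP run, such tuning would be applied to candidate trees while error and tree size are traded off as separate objectives.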

Cited by 7 publications (1 citation statement) | References 24 publications
“…• Standardization: Feature standardization rescales the values of individual features in the data to have zero mean and unit variance (Liu, 2020). The usual computation is to determine each feature's mean and standard deviation and then transform each new data point by the given formula (Rockett, 2022):…”
Section: Feature Scaling
confidence: 99%
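The formula referenced in the quoted passage is not reproduced on this page; under the usual definition of feature standardization it is the z-score transform,

    z_{ij} = \frac{x_{ij} - \mu_j}{\sigma_j},

where \mu_j and \sigma_j are the mean and standard deviation of feature j estimated from the training data and then applied unchanged to any new data point.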