2012
DOI: 10.1007/s12065-012-0070-y
Kernel representations for evolving continuous functions

Abstract: To parameterize continuous functions for evolutionary learning, we use kernel expansions in nested sequences of function spaces of growing complexity. This approach is particularly powerful when dealing with non-convex constraints and discontinuous objective functions. Kernel methods offer a number of beneficial properties for parameterizing continuous functions, such as smoothness and locality, which make them attractive as a basis for mutation operators. Beyond such practical considerations, kernel…
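The abstract's core idea can be sketched in code: represent a continuous function as a weighted sum of Gaussian kernels and let a mutation operator perturb the weights or add kernels, so the representation grows in complexity over the course of evolution. This is a minimal illustration, not the paper's actual method; the function names, the choice of Gaussian kernels, and the mutation rule are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a kernel expansion f(x) = sum_i a_i * k(x, c_i)
# with Gaussian kernels; locality means each term mainly shapes f near
# its center c_i, which makes small mutations act locally and smoothly.

def gaussian_kernel(x, c, gamma=1.0):
    """Gaussian (RBF) kernel k(x, c) = exp(-gamma * (x - c)^2)."""
    return np.exp(-gamma * (x - c) ** 2)

def evaluate(x, centers, alphas, gamma=1.0):
    """Evaluate the kernel expansion f(x) = sum_i alphas[i] * k(x, centers[i])."""
    return sum(a * gaussian_kernel(x, c, gamma) for c, a in zip(centers, alphas))

def mutate(centers, alphas, rng, sigma=0.1, grow_prob=0.3):
    """Illustrative mutation operator: perturb the weights, and with some
    probability append a new kernel, moving to a richer function space."""
    alphas = alphas + rng.normal(0.0, sigma, size=alphas.shape)
    if rng.random() < grow_prob:
        centers = np.append(centers, rng.uniform(-1.0, 1.0))
        alphas = np.append(alphas, rng.normal(0.0, sigma))
    return centers, alphas

# One evolution step on a single-kernel individual.
rng = np.random.default_rng(0)
centers, alphas = np.array([0.0]), np.array([1.0])
centers, alphas = mutate(centers, alphas, rng)
y = evaluate(0.5, centers, alphas)
```

In this sketch, the "nested sequences of function spaces" correspond to expansions with 1, 2, 3, … kernel terms: every function expressible with n kernels is also expressible with n+1 (by setting the extra weight to zero), so each space contains the previous one.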

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1

Citation Types

0
1
0

Year Published

2012
2012
2012
2012

Publication Types

Select...
1

Relationship

1
0

Authors

Journals

Cited by 1 publication (1 citation statement). References 31 publications.
“…It might be possible to achieve even higher compression by switching to a different basis altogether, such Gaussian kernels (Glasmachers et al, 2011) or wavelets. One potential limitation of a Fourier-type basis is that if the frequency content needs to vary across the matrix, then many coefficients will be required to represent it.…”
Section: Discussion and Future Work
Confidence: 99%