2006
DOI: 10.1007/11785231_7

Sum and Product Kernel Regularization Networks

Abstract: We study approximation problems formulated as regularized minimization problems with kernel-based stabilizers. These approximation schemes admit a straightforward derivation of the solution as a linear combination of kernel functions (one-hidden-layer feed-forward neural network schemes). Building on N. Aronszajn's article [1] on reproducing kernels, we use his formulation of the sum of kernels and the product of kernels, and the resulting kernel spaces, to derive approximation schemes: Sum-Kernel Regulariza…
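The construction sketched in the abstract can be illustrated in code. The following is a minimal sketch, not the paper's implementation: it builds Aronszajn-style composite kernels (the sum and the pointwise product of two Gaussian kernels, both again reproducing kernels) and fits the standard regularization-network solution f(x) = Σᵢ cᵢ K(x, xᵢ) with c = (K + γNI)⁻¹ y. The Gaussian widths, the regularization parameter, and the toy data are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(a, b, width):
    """Gaussian (RBF) kernel exp(-||a - b||^2 / width^2)."""
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-np.sum(d**2, axis=-1) / width**2)

def sum_kernel(a, b, w1, w2):
    # Aronszajn: the sum of two reproducing kernels is a reproducing kernel.
    return gaussian_kernel(a, b, w1) + gaussian_kernel(a, b, w2)

def product_kernel(a, b, w1, w2):
    # The pointwise product of two reproducing kernels is also a reproducing kernel.
    return gaussian_kernel(a, b, w1) * gaussian_kernel(a, b, w2)

def fit_rn(X, y, kernel, gamma):
    """Regularization-network coefficients: c = (K + gamma*N*I)^(-1) y."""
    N = len(X)
    K = kernel(X, X)
    return np.linalg.solve(K + gamma * N * np.eye(N), y)

def predict_rn(X_train, c, X_new, kernel):
    """f(x) = sum_i c_i K(x, x_i): a one-hidden-layer network with kernel units."""
    return kernel(X_new, X_train) @ c

# Toy 1-D regression example (hypothetical data).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

kern = lambda a, b: sum_kernel(a, b, w1=0.5, w2=2.0)  # Sum-Kernel RN
c = fit_rn(X, y, kern, gamma=1e-3)
y_hat = predict_rn(X, c, X, kern)
```

Swapping `sum_kernel` for `product_kernel` yields the product-kernel variant; the two composites trade off differently between the narrow and wide component kernels.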

Cited by 6 publications (1 citation statement)
References 8 publications
“…Based on Aronszajn's breakthrough results [32], we have shown that it is possible to use composite kernels as activation functions in the RNs (cf. [33]). Such composite kernels also often outperform a simple kernel function.…”
Section: Composite Kernels
confidence: 99%