We study approximation problems formulated as regularized minimization problems with kernel-based stabilizers. These schemes admit an easy derivation of the solution in the form of a linear combination of kernel functions, i.e., a one-hidden-layer feed-forward neural network. Building on N. Aronszajn's work on reproducing kernels [1], we use his formulation of the sum and the product of kernels, and of the resulting kernel spaces, to derive two approximation schemes: the Sum-Kernel Regularization Network and the Product-Kernel Regularization Network. We present concrete applications of the derived schemes, demonstrate their performance experimentally, and compare them to classical solutions.
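To make the summary above concrete, the following is a minimal sketch (not code from the paper) of a regularization network in Python: the solution is taken as a linear combination f(x) = Σᵢ cᵢ K(x, xᵢ), with coefficients obtained from one common form of the linear system, (K + γI)c = y. The Gaussian kernel, its widths, and the value of `gamma` are illustrative assumptions; the sum- and product-kernel constructions reflect Aronszajn's results that both operations yield reproducing kernels again.

```python
import numpy as np

def gaussian_kernel(x, y, width):
    # Gaussian kernel K(x, y) = exp(-||x - y||^2 / width^2); width is illustrative.
    return np.exp(-np.sum((x - y) ** 2) / width ** 2)

def gram_matrix(xs, kernel):
    # Gram matrix K[i, j] = K(x_i, x_j) over the training points.
    n = len(xs)
    return np.array([[kernel(xs[i], xs[j]) for j in range(n)] for i in range(n)])

def fit_regularization_network(xs, ys, kernel, gamma):
    # Solve (K + gamma * I) c = y for the coefficients of
    # f(x) = sum_i c_i K(x, x_i)  (one common form of the solution).
    K = gram_matrix(xs, kernel)
    c = np.linalg.solve(K + gamma * np.eye(len(xs)), ys)
    return lambda x: sum(ci * kernel(x, xi) for ci, xi in zip(c, xs))

# By Aronszajn's results, the sum and the (pointwise) product of two
# reproducing kernels are again reproducing kernels, so both variants
# can be plugged into the same solver.
def sum_kernel(k1, k2):
    return lambda x, y: k1(x, y) + k2(x, y)

def product_kernel(k1, k2):
    return lambda x, y: k1(x, y) * k2(x, y)

# Toy 1-D usage (data, widths, and gamma are all illustrative):
xs = np.linspace(0, 1, 20).reshape(-1, 1)
ys = np.sin(2 * np.pi * xs).ravel() + 0.1 * np.random.randn(20)
k = sum_kernel(lambda x, y: gaussian_kernel(x, y, 0.2),
               lambda x, y: gaussian_kernel(x, y, 1.0))
f = fit_regularization_network(xs, ys, k, gamma=1e-3)
print(f(np.array([0.5])))
```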