DOI: 10.1007/978-3-540-87732-5_43

Function Approximation by Neural Networks

Abstract: We investigate a Tikhonov regularization scheme specifically tailored for shallow neural networks in the context of solving a classic inverse problem: approximating an unknown function and its derivatives on a unit cubic domain from noisy measurements. The proposed scheme incorporates a penalty term based on three distinct yet intricately related network (semi)norms: the extended Barron norm, the variation norm, and the Radon-BV seminorm. These choices of the penalty term …
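The abstract is truncated, so the following is only a hedged sketch in standard Tikhonov notation, not necessarily the authors' exact formulation: given noisy samples \(y_i^{\delta}\) of an unknown function \(f^{\dagger}\) at points \(x_i\) in the unit cube, the scheme selects a shallow network \(f_\theta\) by minimizing

\[
\frac{1}{n} \sum_{i=1}^{n} \bigl( f_\theta(x_i) - y_i^{\delta} \bigr)^2 \;+\; \alpha \, \lVert f_\theta \rVert_{\mathcal{N}},
\]

where \(\lVert \cdot \rVert_{\mathcal{N}}\) stands for one of the three penalty (semi)norms named above (extended Barron, variation, or Radon-BV) and \(\alpha > 0\) balances data fidelity against network complexity.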

Cited by 5 publications (1 citation statement)
References 44 publications
“…When appropriate radial basis functions are chosen, spectral convergence rates can be achieved (see e.g. References [10,11]). It can also be shown that the derivatives of the interpolated function can be approximated just as accurately by the derivative of the RBF interpolator.…”
Section: The Model Equations and Their Cut Cell Discretization
confidence: 99%
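To make the quoted claim concrete, below is a minimal, self-contained sketch of Gaussian RBF interpolation in one dimension. The kernel choice, the shape parameter `eps`, and all function names are illustrative assumptions, not details taken from References [10,11]; the point is simply that differentiating the fitted interpolant also approximates the derivative of the target function.

```python
# Minimal 1-D Gaussian RBF interpolation sketch (illustrative, not the
# cited references' method). Fits s(x) = sum_j w_j * exp(-(eps*(x-x_j))^2)
# and compares both s and s' against the exact function and derivative.
import numpy as np

def rbf_fit(x_nodes, y_nodes, eps=3.0):
    """Solve for weights w so that s interpolates (x_nodes, y_nodes) exactly."""
    A = np.exp(-(eps * (x_nodes[:, None] - x_nodes[None, :])) ** 2)
    return np.linalg.solve(A, y_nodes)

def rbf_eval(x, x_nodes, w, eps=3.0):
    """Evaluate the interpolant s and its derivative s' at the points x."""
    d = x[:, None] - x_nodes[None, :]        # pairwise differences x - x_j
    phi = np.exp(-(eps * d) ** 2)            # Gaussian kernel values
    s = phi @ w
    ds = (-2.0 * eps**2 * d * phi) @ w       # d/dx of exp(-(eps*d)^2)
    return s, ds

# Interpolate f(x) = sin(x) on [0, pi] from 15 nodes.
x_nodes = np.linspace(0.0, np.pi, 15)
w = rbf_fit(x_nodes, np.sin(x_nodes))
x_test = np.linspace(0.0, np.pi, 200)
s, ds = rbf_eval(x_test, x_nodes, w)
print("max |s  - sin| :", np.max(np.abs(s - np.sin(x_test))))
print("max |s' - cos| :", np.max(np.abs(ds - np.cos(x_test))))
```

For this smooth target both errors should already be small at 15 nodes, and refining the node set should show the rapid error decay the quote attributes to well-chosen kernels.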