2023
DOI: 10.1007/s00365-023-09626-4
Characterization of the Variation Spaces Corresponding to Shallow Neural Networks

Abstract: We study the following two related problems. The first is to determine to what error an arbitrary zonoid in R^{d+1} can be approximated in the Hausdorff distance by a sum of n line segments. The second is to determine optimal approximation rates in the uniform norm for shallow ReLU^k neural networks on their variation spaces. The first of these problems has been solved when d ≠ 2, 3, but for d = 2, 3 a logarithmic gap between the best upper and lower bounds remains. We close this gap, which completes the solutio…

Cited by 7 publications (10 citation statements). References 43 publications.
“…When compared to the results in [34], our approximation rate is faster and the Barron space in our setting is larger than the spectral Barron space in [34]. Compared with other approximation results for Barron functions [36,37], the constants in our results are independent of the dimension. • For the DRM, we derive sharper generalization bounds for the Poisson equation and the Schrödinger equation with Neumann boundary condition, regardless of whether the solutions fall in Barron spaces or Sobolev spaces.…”
Section: Contributions
confidence: 55%
“…Note that we choose the 1-norm in the definition just for simplicity. There are also several different definitions of the Barron space [41], and the relationships between them have been studied in [42]. The most important property of functions in the Barron space is that they can be efficiently approximated by two-layer neural networks without the curse of dimensionality.…”
Section: Deep Ritz Methods
confidence: 99%
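The Barron-space property quoted above — efficient approximation by two-layer (one-hidden-layer) networks — can be made concrete with a minimal sketch. The code below is illustrative only and not taken from any of the cited papers: it evaluates a shallow ReLU network f(x) = Σᵢ aᵢ ReLU(wᵢ·x + bᵢ) and computes the 1-norm of its outer coefficients, a simple proxy for the variation/Barron-type norms these papers discuss. All variable names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 3, 50                       # input dimension, number of neurons
W = rng.standard_normal((n, d))    # inner weights w_i
b = rng.standard_normal(n)         # biases b_i
a = rng.standard_normal(n) / n     # outer coefficients a_i

def shallow_relu(x):
    """Evaluate the one-hidden-layer ReLU network at a single input x."""
    return a @ np.maximum(W @ x + b, 0.0)

# Weighted 1-norm of the outer coefficients: a rough stand-in for the
# variation-norm control under which the dimension-free approximation
# rates for Barron functions are stated.
path_norm = np.sum(np.abs(a) * (np.linalg.norm(W, axis=1) + np.abs(b)))
```

The key point the quoted statements make is that the approximation error of such networks is controlled by a norm of this kind, with constants independent of the dimension d.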
“…A number of parameters are generated by the fit net shallow neural network. In summary, a "shallow" neural network is one that generally includes only one hidden layer [13]. As seen in Figure 5.…”
Section: Fit Net Shallow Neural Network
confidence: 99%
“…The second norm originates from the convex hull of the dictionary sets. We cite the following definition and refer directly to [32] for further details.…”
Section: Overview Of the Shallow Neural Network Norms
confidence: 99%
“…Systematic studies on the universality properties of two-layer neural networks, specifically those employing the ReLU activation function, have been conducted in [14], [31], considering various a priori knowledge about the unknown target function. Regarding the universal approximation theorem for derivatives, a thorough and systematic investigation has been conducted in [31], [33], [32], where the approximation rate in terms of the number of neurons has been comprehensively established. Some attempts have been undertaken to approximate unknown functions using deep neural networks with ReLU activation functions.…”
Section: Introduction
confidence: 99%