2021
DOI: 10.48550/arxiv.2104.02903
Preprint
A Priori Analysis of Stable Neural Network Solutions to Numerical PDEs

Cited by 12 publications (21 citation statements)
References 33 publications
“…Greedy algorithms for expanding a function f ∈ H as a linear combination of the dictionary elements are fundamental in approximation theory [11,41,40] and signal processing [29,30]. Greedy methods have also been proposed for optimizing shallow neural networks [23,9] and recently such methods have been proposed for training shallow neural networks to solve PDEs [17], which is the approach we take here.…”
Section: Problem Setup
confidence: 99%
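The greedy expansion of a function as a linear combination of dictionary elements, as described in the excerpt above, can be sketched as a basic matching pursuit. This is a minimal illustration, not the method of the cited works; the dictionary `D`, target `f`, and step count are illustrative assumptions.

```python
import numpy as np

def matching_pursuit(f, D, n_steps):
    """Greedy sketch: at each step, add the unit-norm dictionary column
    (atom) most correlated with the current residual.

    f: target vector; D: matrix whose columns are unit-norm atoms.
    All names here are illustrative, not taken from the cited papers.
    """
    residual = f.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_steps):
        correlations = D.T @ residual          # inner products <residual, d_j>
        j = int(np.argmax(np.abs(correlations)))
        coeffs[j] += correlations[j]           # grow the coefficient of atom j
        residual -= correlations[j] * D[:, j]  # remove that component
    return coeffs, residual
```

With an orthonormal dictionary, each step recovers one coordinate of `f` exactly; for overcomplete dictionaries the same loop yields a sparse approximate expansion.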
“…This algorithm was first introduced and analyzed by Jones [18] for function approximation (i.e., J(u) = ‖u − f‖²_H), and has been extended to the optimization of general convex objectives as well [46,17]. The convergence theorem we will use in our analysis is the following.…”
Section: Relaxed Greedy Algorithm
confidence: 99%
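The relaxed greedy iteration attributed to Jones above can be sketched for the quadratic objective J(u) = ‖u − f‖²: each iterate is a convex combination of the previous iterate and a greedily chosen scaled atom. The step size 2/(n+1), the variation bound `M`, and all variable names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def relaxed_greedy(f, D, M, n_steps):
    """Sketch of a relaxed greedy iteration for J(u) = ||u - f||^2:
    u_n = (1 - s_n) u_{n-1} + s_n * M * g_n with s_n = 2/(n+1),
    where g_n is the signed unit atom best aligned with the residual.
    Assumes f lies in M times the convex hull of {±columns of D}.
    """
    u = np.zeros_like(f, dtype=float)
    for n in range(1, n_steps + 1):
        residual = f - u
        correlations = D.T @ residual
        j = int(np.argmax(np.abs(correlations)))
        g = np.sign(correlations[j]) * D[:, j]  # signed greedy atom
        step = 2.0 / (n + 1)
        u = (1 - step) * u + step * M * g       # relaxed (convex) update
    return u
```

This is the Frank-Wolfe-style form of the relaxation: because each update is a convex combination, the iterates stay inside the scaled hull of the dictionary, which is what makes the classical O(1/n) objective-decay analysis possible.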
“…For PINNs, the convergence analysis is provided in [51,62,63], and a PINN with a ReLU³ network is analyzed with a convergence rate given in the C² norm ([35]). The error analysis of DRM was established in [47,68,32,46] by assuming that the exact solution lies in the spectral Barron space, which has the property of being well approximated by a two-layer neural network. The convergence rate of DRM with smooth activation functions, such as the logistic or hyperbolic tangent, was derived in the H¹ norm for elliptic equations ([20,36]).…”
Section: Introduction
confidence: 99%