2022
DOI: 10.1093/imatrm/tnac001
Error estimates for DeepONets: a deep learning framework in infinite dimensions

Abstract: DeepONets have recently been proposed as a framework for learning nonlinear operators mapping between infinite-dimensional Banach spaces. We analyze DeepONets and prove estimates on the resulting approximation and generalization errors. In particular, we extend the universal approximation property of DeepONets to include measurable mappings in non-compact spaces. By a decomposition of the error into encoding, approximation and reconstruction errors, we prove both lower and upper bounds on the total error, rela…
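The error decomposition named in the abstract can be sketched as follows. This is a hedged reconstruction for illustration only: the symbols for the encoder, approximator, and reconstructor are assumed here, not taken from the (truncated) abstract.

```latex
% A DeepONet viewed as a composition of three maps
% (notation assumed for illustration):
%   encoder        \mathcal{E} : X \to \mathbb{R}^m  (point evaluations at m sensors)
%   approximator   \mathcal{A} : \mathbb{R}^m \to \mathbb{R}^p  (the neural networks)
%   reconstructor  \mathcal{R} : \mathbb{R}^p \to Y  (expansion in the trunk-net basis)
\[
  \mathcal{N} \;=\; \mathcal{R} \circ \mathcal{A} \circ \mathcal{E},
\]
% so that, up to Lipschitz constants of the component maps, the total
% error splits into three contributions:
\[
  \widehat{\mathcal{E}}
  \;\lesssim\;
  \underbrace{\widehat{\mathcal{E}}_{\mathcal{E}}}_{\text{encoding}}
  \;+\;
  \underbrace{\widehat{\mathcal{E}}_{\mathcal{A}}}_{\text{approximation}}
  \;+\;
  \underbrace{\widehat{\mathcal{E}}_{\mathcal{R}}}_{\text{reconstruction}}.
\]
```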

Cited by 92 publications (96 citation statements)
References 49 publications
“…Furthermore, the operator G may not always be continuous. To address these issues, one can follow the recent paper [15], where the authors prove a more general version for the universal approximation of operators for DeepOnets; for any input measure µ, and a Borel measurable operator G, such that…”
Section: Proof
confidence: 99%
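The extension quoted above can be stated schematically. This is a hedged paraphrase: the precise hypotheses are those of the cited paper [15], and the measure-theoretic notation used here is assumed for illustration.

```latex
% For a probability measure \mu on the input space X and a Borel
% measurable operator G with G \in L^2(\mu), for every \epsilon > 0
% there exists a DeepONet \mathcal{N}_\theta such that
\[
  \| G - \mathcal{N}_\theta \|_{L^2(\mu)}
  \;=\;
  \Bigl( \int_X \bigl\| G(u) - \mathcal{N}_\theta(u) \bigr\|^2 \, d\mu(u) \Bigr)^{1/2}
  \;<\; \epsilon .
\]
```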
“…More recently, the authors of [21] have proposed using deep, instead of shallow, neural networks in both the trunk and branch net and have christened the resulting architecture as a DeepOnet. In a recent article [15], the universal approximation property of DeepOnets was extended, making it completely analogous to universal approximation results for finite-dimensional functions by neural networks. The authors of [15] were also able to show that DeepOnets can break the curse of dimensionality for a large variety of PDE learning tasks.…”
Section: Introduction
confidence: 99%
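The branch/trunk architecture described in the quotation above can be sketched in a few lines. This is a minimal illustration with random (untrained) weights, not the implementation from [21] or [15]; the helper names `mlp`, `forward`, and `deeponet` are hypothetical, and the sensor count, width, and basis dimension are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight MLP parameters (hypothetical helper for illustration)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Apply an MLP with tanh activations and a linear last layer."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Branch net: maps m sensor values u(x_1), ..., u(x_m) to p coefficients.
# Trunk net:  maps a query point y to p basis-function values.
m, p = 10, 16
branch = mlp([m, 32, p])
trunk = mlp([1, 32, p])

def deeponet(u_sensors, y):
    """N_theta(u)(y) = sum_k branch_k(u) * trunk_k(y)."""
    b = forward(branch, u_sensors)        # shape (p,)
    t = forward(trunk, np.atleast_1d(y))  # shape (p,)
    return float(b @ t)

u = np.sin(np.linspace(0.0, 1.0, m))  # sensor readings of one input function
value = deeponet(u, 0.5)              # approximates G(u)(0.5) after training
```

The dot-product combination of the two networks is what makes the output a function of the query point `y` for each input function `u`; training would fit the random weights above to operator data.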
“…A very incomplete list of examples where deep learning is used for the numerical solutions of differential equations includes the solution of high-dimensional linear and semi-linear parabolic partial differential equations [9,13] and references therein, and for many-query problems such as those arising in uncertainty quantification (UQ), PDE constrained optimization and (Bayesian) inverse problems. Such problems can be recast as parametric partial differential equations and the use of deep neural networks in their solution is explored for elliptic and parabolic PDEs in [22,43], for transport PDEs [24] and for hyperbolic and related PDEs [6,[34][35][36], and as operator learning frameworks in [2,28,30,32] and references therein. All the afore-mentioned methods are of the supervised learning type [12] i.e., the underlying deep neural networks have to be trained on data, either available from measurements or generated by numerical simulations.…”
Section: Introduction
confidence: 99%