2022
DOI: 10.48550/arxiv.2201.00217
Preprint

Deep Nonparametric Estimation of Operators between Infinite Dimensional Spaces

Abstract: Learning operators between infinite-dimensional spaces is an important learning task with wide applications in machine learning, imaging science, mathematical modeling and simulations, etc. This paper studies the nonparametric estimation of Lipschitz operators using deep neural networks. Non-asymptotic upper bounds are derived for the generalization error of the empirical risk minimizer over a properly chosen network class. Under the assumption that the target operator exhibits a low dimensional struct…
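The abstract's setup — empirical risk minimization over a hypothesis class, applied to discretized input/output functions — can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the antiderivative target operator, the grid discretization, and the ridge-regularized linear model standing in for the paper's deep network class are not the paper's actual construction.

```python
import numpy as np

# Hedged sketch of operator learning by empirical risk minimization:
# discretize input/output functions on a grid (a crude form of model
# reduction), then fit a map between the discretized representations.
# A ridge-regularized linear model stands in for the deep network class
# analyzed in the paper.

rng = np.random.default_rng(0)
n_grid = 32                        # discretization points on [0, 1]
x = np.linspace(0.0, 1.0, n_grid)

def sample_input():
    # random smooth input u(x) = a*sin(pi x) + b*cos(pi x)
    a, b = rng.normal(size=2)
    return a * np.sin(np.pi * x) + b * np.cos(np.pi * x)

def target_operator(u):
    # toy Lipschitz operator: cumulative integral (antiderivative)
    return np.cumsum(u) / n_grid

# training set of (discretized input, discretized output) pairs
n_train = 200
U = np.stack([sample_input() for _ in range(n_train)])
V = np.stack([target_operator(u) for u in U])

# empirical risk minimizer over the (here: linear) hypothesis class:
# W = argmin_W sum_i ||u_i W - v_i||^2 + lam * ||W||^2
lam = 1e-6
W = np.linalg.solve(U.T @ U + lam * np.eye(n_grid), U.T @ V)

# estimate the generalization error on held-out inputs
U_test = np.stack([sample_input() for _ in range(50)])
V_test = np.stack([target_operator(u) for u in U_test])
err = np.mean((U_test @ W - V_test) ** 2)
print(f"mean squared test error: {err:.2e}")
```

Because the toy operator is linear and the inputs lie in a two-dimensional subspace, even this linear stand-in drives the test error close to zero; the paper's results concern the harder setting of general Lipschitz operators approximated by deep networks.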

Cited by 6 publications (12 citation statements)
References 67 publications
“…In particular, Example 6 shows that our static universal approximation result for our architecture can approximate more general output spaces than the DeepONets of [105,101]. Further, our results yield quantitative counterparts to the qualitative, Banach space-valued universal approximation results of [23] for their feedforward-type architecture.…”
supporting
confidence: 54%
“…Recently, various deep learning models mapping into infinite dimensional Banach spaces have been proposed such as the DeepONets of [105,101], the Fourier Neural Operators of [88], and the generalized feedforward model of [23] generalized to Fréchet spaces (in the special case where the space of processes is a Martingale-Hardy space or a similar linear space of semi-Martingales, see [38,Part IV]). Our models can cover approximation of functions taking values in Banach spaces, and, more generally in Fréchet spaces.…”
mentioning
confidence: 99%
“…We note that every compact set of functions on R q has a non-linear width, depending necessarily on q, usually increasing with q. In [21], the authors give statistical estimates on the error in approximation in the presence of noisy data for the solution of judiciously formulated optimization problems involved in training the networks used for approximation.…”
Section: Related Work
mentioning
confidence: 99%
“…(1) for any given function pair g 1 , g 2 without learning again. A few methods have been proposed for learning operators by neural networks to solve PDEs, such as DeepONet [12], DeepGreen [13], Neural Operator [14,15], MOD-Net [16], and the deep learning-based nonparametric estimation [17]. The DeepGreen, Neural Operator and MOD-Net methods are based on Green's functions for solving PDEs, i.e., these methods learn the Green's function instead of learning the operator directly.…”
Section: Introduction
mentioning
confidence: 99%
“…The DeepONet [12] is a method that learns nonlinear operators associated with PDEs from data, based on the approximation theorem for operators by neural networks [18]. The method in [17] learns the operator by model reduction [19], reducing the operator to a finite-dimensional space. Most of these available works on learning operators focused on the development of algorithms.…”
Section: Introduction
mentioning
confidence: 99%