2020
DOI: 10.1016/j.jcp.2020.109309

Meta-learning pseudo-differential operators with deep neural networks

Abstract: This paper introduces a meta-learning approach for parameterized pseudo-differential operators with deep neural networks. With the help of the nonstandard wavelet form, the pseudo-differential operators can be approximated in a compressed form with a collection of vectors. The nonlinear map from the parameter to this collection of vectors and the wavelet transform are learned together from a small number of matrix-vector multiplications of the pseudo-differential operator. Numerical results for Green's functio…
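To make the abstract's pipeline concrete, here is a minimal sketch under assumed names and sizes: a small network maps the operator parameter eta to a collection of vectors defining a compressed surrogate of the operator, and everything is trained only from matrix-vector products, as the abstract describes. This is not the paper's BCR-Net / nonstandard-wavelet architecture; the low-rank surrogate, the toy Fourier-multiplier stand-in for the pseudo-differential operator, and the names n, k, matvec, net, W are all assumptions for illustration.

```python
# Hypothetical sketch (assumed names/sizes; NOT the paper's BCR-Net):
# learn eta -> compressed operator from matrix-vector products only.
import torch

n, k = 64, 16                                  # grid size, number of vectors

def matvec(eta, x):
    # Stand-in for the true parameterized pseudo-differential operator A(eta):
    # a toy Fourier multiplier with an eta-dependent elliptic-type symbol.
    freqs = torch.fft.rfftfreq(n) * n
    symbol = 1.0 / (1.0 + eta * freqs ** 2)
    return torch.fft.irfft(symbol * torch.fft.rfft(x), n)

# Network mapping the parameter eta to k vectors of length n (the
# "collection of vectors"); the trainable mixing matrix W loosely plays
# the role of the learned transform (a low-rank stand-in, not wavelets).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 128), torch.nn.ReLU(), torch.nn.Linear(128, k * n)
)
W = torch.nn.Parameter(torch.randn(n, k) / n ** 0.5)
opt = torch.optim.Adam(list(net.parameters()) + [W], lr=1e-3)

for step in range(2000):
    eta = torch.rand(1)                        # sample an operator parameter
    x = torch.randn(n)                         # random probe vector
    y = matvec(eta, x)                         # one matrix-vector product
    V = net(eta).reshape(k, n)                 # eta-dependent vectors
    y_hat = W @ (V @ x)                        # surrogate A_theta(eta) x
    loss = torch.mean((y_hat - y) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()
```

The points the sketch shares with the paper are the supervision signal (a small number of matvecs per sampled parameter) and the learned nonlinear map from the parameter to the compressed representation; the paper itself compresses via the nonstandard wavelet form rather than a global low-rank factorization.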

Cited by 39 publications (24 citation statements)
References 48 publications
“…Based on the H-matrix and H²-matrix structures, Fan and his coauthors proposed two multiscale neural networks [20,21], which are well suited to learning smooth linear or nonlinear operators owing to the multiscale nature of H-matrices. In addition, the nonstandard wavelet form inspired the design of BCR-Net [22], which has been applied to the inverse of an elliptic operator and to a nonlinear homogenization problem, and has recently been embedded in neural networks for solving electrical impedance tomography [19] and for pseudo-differential operators [23]. The multigrid method also inspired MgNet [25].…”
Section: Related Work (mentioning)
confidence: 99%
“…In recent years, deep neural networks (DNNs) have become very effective tools in a variety of contexts and have achieved great success in computer vision, image processing, speech recognition, and many other artificial intelligence applications [39,46,33,54,50,60,48,58]. More recently, DNNs have been increasingly used in scientific computing, particularly for solving PDE-related problems [43,8,35,25,3,55,47,28]. First, since neural networks offer a powerful tool for approximating high-dimensional functions [17], it is natural to use them as an ansatz for high-dimensional PDEs [57,11,35,44,20].…”
Section: Introduction (mentioning)
confidence: 99%
“…If so, how many pairs are required? These two questions have received significant research attention [17,31,34,43]. From data, one hopes to eventually learn physical laws of nature or conservation laws that elude scientists in the biological sciences [63], computational fluid dynamics [49], and computational physics [45].…”
Section: Introduction (mentioning)
confidence: 99%
“…The approach that dominates the PDE learning literature is to directly learn L by either (1) learning parameters in the PDE [4,64], (2) using neural networks to approximate the action of the PDE on functions [45–49], or (3) deriving a model by composing a library of operators with sparsity considerations [9,35,52,53,59,60]. Instead of trying to learn the unbounded, closed operator L directly, we follow [6,17,18] and discover the Green's function associated with L. That is, we attempt to learn the function $G : D \times D \to \mathbb{R}^{+} \cup \{\infty\}$ such that [16]
$$u_j(x) = \int_D G(x, y)\, f_j(y)\, dy,$$ …”
Section: Introduction (mentioning)
confidence: 99%
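After discretization, the Green's function formulation quoted above reduces to a plain linear-algebra problem, which the following minimal sketch illustrates. It assumes a 1-D toy operator (-u'' with zero Dirichlet boundary conditions, whose Green's function is known in closed form) and recovers the kernel from input-output pairs by least squares; this illustrates only the quoted formula u_j(x) = ∫_D G(x, y) f_j(y) dy, not the method of the quoted paper, and all names and sizes are assumptions.

```python
# Hypothetical illustration of the Green's function viewpoint quoted above:
# on a 1-D grid, u_j = G f_j dx becomes a linear system that least squares
# can solve given enough (f_j, u_j) pairs. Names and setup are assumptions.
import numpy as np

n, m = 50, 200                       # grid points, number of training pairs
x = np.linspace(0, 1, n)
dx = x[1] - x[0]

# Ground-truth Green's function of -u'' = f with zero Dirichlet BCs:
# G(x, y) = min(x, y) * (1 - max(x, y)).
X, Y = np.meshgrid(x, x, indexing="ij")
G_true = np.minimum(X, Y) * (1.0 - np.maximum(X, Y))

rng = np.random.default_rng(0)
F = rng.standard_normal((n, m))      # random forcings f_j as columns
U = G_true @ F * dx                  # u_j(x) = \int G(x,y) f_j(y) dy, discretized

# Recover G from the pairs: U = G (F dx)  =>  G = U (F dx)^+.
G_rec = U @ np.linalg.pinv(F * dx)
print("max error:", np.abs(G_rec - G_true).max())
```

With more random forcings than grid points (m > n) and no noise, F dx has full row rank almost surely and the recovery is essentially exact; the quoted literature is concerned precisely with the harder regime of few pairs, where structural priors on G become necessary.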