2019
DOI: 10.1016/j.jcp.2019.02.002

BCR-Net: A neural network based on the nonstandard wavelet form

Abstract: This paper proposes a novel neural network architecture inspired by the nonstandard form proposed by Beylkin, Coifman, and Rokhlin in [Communications on Pure and Applied Mathematics, 44(2), 141-183]. The nonstandard form is a highly effective wavelet-based compression scheme for linear integral operators. In this work, we first represent the matrix-vector product algorithm of the nonstandard form as a linear neural network where every scale of the multiresolution computation is carried out by a locally connected…
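To make the abstract's construction concrete, the sketch below mimics the matrix-vector product of the nonstandard form with a banded ("locally connected") map at every scale. It is a minimal NumPy illustration under our own assumptions (Haar wavelets, random window-3 weights standing in for the per-scale blocks, a small dense coarse matrix T); the helper names are ours and this is not the paper's implementation.

```python
import numpy as np

def haar_analysis(v, n_levels):
    """Haar analysis: per-scale wavelet (detail) coefficients d[l] and
    scaling (average) coefficients s[l], with s[0] = v."""
    d, s = [], [np.asarray(v, dtype=float)]
    for _ in range(n_levels):
        even, odd = s[-1][0::2], s[-1][1::2]
        d.append((even - odd) / np.sqrt(2))
        s.append((even + odd) / np.sqrt(2))
    return d, s

def local_apply(W, x):
    """Banded ("locally connected") linear map: output i depends only on a
    short window of x around i.  In BCR-Net these windows would carry
    learned weights; here W is just a (len(x), window) array."""
    half = W.shape[1] // 2
    xp = np.pad(x, half)
    return np.array([W[i] @ xp[i:i + W.shape[1]] for i in range(x.size)])

def nonstandard_matvec(v, ABC, T, n_levels):
    """Nonstandard-form matrix-vector product u = K v.  At each level l the
    banded blocks act only on that level's coefficients,
        d_hat[l] = A_l d[l] + B_l s[l+1],   s_hat[l] = C_l d[l],
    while the coarsest scaling part goes through a small dense matrix T.
    Reconstruction folds each s_hat[l] back in at its own scale."""
    d, s = haar_analysis(v, n_levels)
    d_hat = [local_apply(A, d[l]) + local_apply(B, s[l + 1])
             for l, (A, B, _) in enumerate(ABC)]
    s_hat = [local_apply(C, d[l]) for l, (_, _, C) in enumerate(ABC)]
    u = T @ s[-1] + s_hat[-1]                     # coarsest level
    for l in reversed(range(n_levels)):
        if l < n_levels - 1:
            u = u + s_hat[l]                      # fold in C_l d[l] at its scale
        out = np.empty(2 * u.size)                # one Haar synthesis step
        out[0::2] = (u + d_hat[l]) / np.sqrt(2)
        out[1::2] = (u - d_hat[l]) / np.sqrt(2)
        u = out
    return u

# toy usage: n = 64, 3 levels, window-3 local maps with random weights
rng = np.random.default_rng(0)
n, L, w = 64, 3, 3
ABC = [tuple(rng.standard_normal((n >> (l + 1), w)) for _ in range(3))
       for l in range(L)]
T = rng.standard_normal((n >> L, n >> L))
u = nonstandard_matvec(rng.standard_normal(n), ABC, T, L)
```

In BCR-Net the fixed banded blocks per scale become trainable locally connected layers, with nonlinearities added so the architecture can also target the nonlinear problems cited in the statements below.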

Cited by 47 publications (39 citation statements)
References 41 publications
“…Based on the H-matrix and H²-matrix structures, Fan and his coauthors proposed two multiscale neural networks [20,21], which are more suitable for training smooth linear or nonlinear operators due to the multiscale nature of H-matrices. In addition, the nonstandard wavelet form inspired the design of BCR-Net [22], which has been applied to the inverse of an elliptic operator and a nonlinear homogenization problem, and has recently been embedded in neural networks for solving electrical impedance tomography [19] and pseudo-differential operators [23]. The multigrid method also inspired MgNet [25].…”
Section: Related Work (mentioning)
confidence: 99%
“…with periodic boundary conditions. The second positivity constraint in (12) can be dropped since if u(x) is the eigenvector associated with the smallest eigenvalue then so is −u(x). Besides, the right-hand side of the first constraint in (12) can take any positive constant since this eigenvalue problem is linear. The external potential is randomly generated to simulate a crystal with two different atoms in each unit cell, i.e., V(x) is randomly generated via…”
Section: Linear Schrödinger Equation (mentioning)
confidence: 99%
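The sign and normalization freedom noted in this statement are easy to see in a small numerical example. Below is a minimal NumPy sketch under our own assumptions (a second-order finite-difference discretization of H = −d²/dx² + V(x) on [0, 1) with periodic boundary conditions, and Gaussian wells standing in for the "two different atoms"); the cited paper's constraint (12) and its exact recipe for V(x) are not reproduced here, since the snippet cuts off before giving them.

```python
import numpy as np

# Smallest eigenpair of H = -d^2/dx^2 + V(x) on [0, 1) with periodic BCs.
n = 256
h = 1.0 / n
x = np.arange(n) * h
rng = np.random.default_rng(0)

def two_atom_potential(x, rng):
    """Toy 'two atoms per unit cell' potential: two Gaussian wells with
    random depths and centers (an assumption, not the paper's formula)."""
    V = np.zeros_like(x)
    for depth in rng.uniform(1.0, 10.0, size=2):
        center = rng.uniform(0.0, 1.0)
        dist = np.minimum(np.abs(x - center), 1.0 - np.abs(x - center))  # periodic distance
        V -= depth * np.exp(-(dist / 0.05) ** 2)
    return V

V = two_atom_potential(x, rng)

# periodic 1D Laplacian (circulant tridiagonal with wrap-around corners)
lap = (-2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
       + np.eye(n, k=n - 1) + np.eye(n, k=-(n - 1))) / h**2
H = -lap + np.diag(V)

evals, evecs = np.linalg.eigh(H)
E0, u0 = evals[0], evecs[:, 0]
# Sign ambiguity from the citation: -u0 is an equally valid eigenvector,
# so a positivity constraint can be replaced by a sign convention.
u0 = u0 if u0.sum() >= 0 else -u0
# Normalization is also arbitrary up to a positive constant (the problem
# is linear); here we enforce the discrete analogue of ∫ u^2 dx = 1.
u0 /= np.sqrt(h) * np.linalg.norm(u0)
```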
“…This paper is concerned with a more ambitious task: representing the nonlinear map from η to G_η,

M : η → G_η = L_η⁻¹,  (1.2)

…NNs have been applied to approximate high-dimensional solutions of high-dimensional PDEs [36,52,14,46,4,6,13,25,31,39]. In a somewhat orthogonal direction, NNs have been utilized to approximate the high-dimensional parameter-to-solution maps of various PDEs and IEs [30,26,18,17,19,32,25,2,38,20].…”
Section: Introduction (mentioning)
confidence: 99%
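For a concrete, though assumed, instance of the map η ↦ G_η = L_η⁻¹: the snippet does not specify L_η, so the sketch below picks a standard 1D divergence-form elliptic operator L_η u = −(η(x) u′(x))′ with zero Dirichlet boundary conditions, discretized by finite differences. The function names (build_L, G) and all constants are illustrative only; a network such as BCR-Net would be trained to emulate the solve in the last line.

```python
import numpy as np

def build_L(eta):
    """Assemble the discrete operator L_η for a coefficient vector η sampled
    on an n-point interior grid of (0, 1) (zero Dirichlet BCs assumed)."""
    n = eta.size
    h = 1.0 / (n + 1)
    # simple arithmetic averaging of η at the cell interfaces
    eta_faces = np.empty(n + 1)
    eta_faces[1:-1] = 0.5 * (eta[:-1] + eta[1:])
    eta_faces[0], eta_faces[-1] = eta[0], eta[-1]
    L = np.zeros((n, n))
    for i in range(n):
        L[i, i] = (eta_faces[i] + eta_faces[i + 1]) / h**2
        if i > 0:
            L[i, i - 1] = -eta_faces[i] / h**2
        if i < n - 1:
            L[i, i + 1] = -eta_faces[i + 1] / h**2
    return L

def G(eta, f):
    """Apply G_η = L_η^{-1} to a right-hand side f (dense solve for clarity)."""
    return np.linalg.solve(build_L(eta), f)

# toy usage: a smooth positive coefficient and a localized source
n = 128
x = np.arange(1, n + 1) / (n + 1)
eta = 1.0 + 0.5 * np.sin(2 * np.pi * x) ** 2
f = np.exp(-((x - 0.5) / 0.05) ** 2)
u = G(eta, f)   # the parameter-to-solution map a network would emulate
```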