2019
DOI: 10.1137/18m1203602
A Multiscale Neural Network Based on Hierarchical Matrices

Abstract: In this work we introduce a new multiscale artificial neural network based on the structure of H-matrices. This network generalizes the latter to the nonlinear case by introducing a local deep neural network at each spatial scale. Numerical results indicate that the network is able to efficiently approximate discrete nonlinear maps obtained from discretized nonlinear partial differential equations, such as those arising from nonlinear Schrödinger equations and the Kohn-Sham density functional theory.
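The architecture sketched in the abstract replaces the low-rank off-diagonal factors of an H-matrix with small nonlinear networks, one per spatial scale, plus a near-field correction. The following is a minimal NumPy sketch of that idea, not the paper's implementation: the class names (`ScaleBlock`, `MNNH`), the averaging restriction/interpolation operators, and the untrained random weights are all illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class ScaleBlock:
    """One spatial scale: restrict -> small nonlinear map -> interpolate.
    Plays the role of a low-rank off-diagonal H-matrix block, with the
    linear factor replaced by a tiny two-layer network (illustrative)."""
    def __init__(self, n, m, rng):
        # Fixed averaging restriction (m, n) and piecewise-constant
        # interpolation (n, m); n must be divisible by m.
        self.R = np.repeat(np.eye(m), n // m, axis=1) * (m / n)
        self.P = np.repeat(np.eye(m), n // m, axis=0)
        self.W1 = rng.standard_normal((m, m)) / np.sqrt(m)
        self.W2 = rng.standard_normal((m, m)) / np.sqrt(m)

    def __call__(self, v):
        coarse = self.R @ v            # restrict to m coarse degrees of freedom
        h = relu(self.W1 @ coarse)     # local nonlinear map at this scale
        return self.P @ (self.W2 @ h)  # interpolate back to the fine grid

class MNNH:
    """Sum of scale blocks plus a pointwise near-field term."""
    def __init__(self, n, scales, rng):
        self.blocks = [ScaleBlock(n, m, rng) for m in scales]
        self.a = rng.standard_normal(n) / np.sqrt(n)  # diagonal near field

    def __call__(self, v):
        out = self.a * v               # near-field (diagonal) part
        for b in self.blocks:
            out = out + b(v)           # accumulate far-field scales
        return out

rng = np.random.default_rng(0)
net = MNNH(n=64, scales=[4, 8, 16], rng=rng)
y = net(rng.standard_normal(64))
```

The cost per scale is O(n + m²) rather than the O(n²) of a dense map, which is the point of the hierarchical structure; a trained version would fit the `W1`, `W2`, and `a` parameters to input-output pairs of the discretized nonlinear operator.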

Cited by 57 publications (68 citation statements). References 63 publications.
“…Finally, our study is potentially universal and can be translated to numerous other solving techniques, including the Generalized Minimal RESidual (GMRES) linear solver. This comes as interest in deep learning has exploded in recent years and approximating neural network models with H-matrices is taking hold [9,10].…”
Section: Discussion
confidence: 99%
“…Since the convolution in (3.9) is global, the architectural parameters are chosen with w N_cnn ≥ N_s (3.10) so that the resulting network is capable of capturing global interactions. When N_s is large, the recently proposed multiscale neural networks, for example MNN-H-net [25], MNN-H²-net [24], and BCR-net [23], may be more efficient for such global interactions. However, to simplify the presentation, the discussion here sticks to convolutional layers.…”
Section: Forward Problem of OT
confidence: 99%
“…the application of K^T to λ can be approximated with a one-dimensional convolutional neural network, similar to K. For the part K^T K + I, which can be viewed as a post-processing step in the (ρ, θ) space, we implement this with several two-dimensional convolutional layers for simplicity. However, for problems with larger sizes, multiscale neural networks such as [25,24,23] can also be used. The resulting architecture for the inverse map is summarized in Algorithm 2 and illustrated in Section 3.2.…”
Section: Inverse Problem of OT
confidence: 99%