2021
DOI: 10.1016/j.jcp.2020.109907
On some neural network architectures that can represent viscosity solutions of certain high dimensional Hamilton–Jacobi partial differential equations

Cited by 51 publications (42 citation statements). References 55 publications.
“…SciML includes a variety of techniques aiming at seamlessly combining observational data with available physical models. They can be divided into three main categories: (1) the ones that seek to emulate physical systems by utilizing large data amounts, such as the deep operator network (DeepONet) developed by Lu et al [6] (see also [7,8] for different versions of DeepONet and [9] for theoretical results, as well as [10] for a different approach termed Fourier neural operator); (2) the ones that encode physics into the NN architecture, such as the architectures developed by Darbon and Meng [11]; and (3) the ones that add physical soft constraints in the NN optimization process, such as the physics-informed neural network (PINN) developed by Raissi et al [12] (see also [13][14][15][16][17][18] for different versions of PINN and [19,20] for theoretical results). These techniques flourish, in principle, where the applicability of conventional solvers diminishes.…”
Section: Motivation and Scope of the Paper
confidence: 99%
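The first category above can be made concrete with a minimal sketch of the DeepONet idea: a branch network encodes the input function through its values at fixed sensor points, a trunk network encodes the query location, and the operator output is the inner product of the two latent vectors. The code below is an illustrative, untrained forward pass in numpy; the layer widths, sensor count, and helper names are assumptions, not the architecture of [6].

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(widths):
    """Build a random-weight tanh MLP and return its forward function (illustrative only)."""
    params = [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
              for m, n in zip(widths[:-1], widths[1:])]
    def forward(x):
        h = x
        for i, (W, b) in enumerate(params):
            h = h @ W + b
            if i < len(params) - 1:
                h = np.tanh(h)
        return h
    return forward

m, p = 50, 20             # number of sensor points, latent width
branch = mlp([m, 64, p])  # encodes the input function u sampled at m sensors
trunk  = mlp([1, 64, p])  # encodes the query location y

def deeponet(u_sensors, y):
    """G(u)(y) ~ <branch(u), trunk(y)>: the DeepONet output at location y."""
    return float(branch(u_sensors[None, :]) @ trunk(np.array([[y]])).T)

# Query the (untrained) operator on samples of u(x) = sin(pi*x).
xs = np.linspace(0.0, 1.0, m)
out = deeponet(np.sin(np.pi * xs), 0.5)
```

In a trained model both networks would be fit jointly so that the inner product approximates the target operator at arbitrary query locations.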
“…In [40], an approach for solving a certain kind of high-dimensional first-order Hamilton–Jacobi equations is proposed based on the Hopf formula [77], whose computational expense grows polynomially in the spatial dimension. In subsequent work, first-order Hamilton–Jacobi equations in high dimension are considered in [38,39] based on classes of neural networks that exactly encode the viscosity solutions of these equations.…”
Section: Extensions and Related Work
confidence: 99%
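The Hopf formula mentioned above represents the viscosity solution of u_t + H(∇u) = 0, u(x, 0) = g(x) (for convex initial data g) as u(x, t) = sup_p {⟨p, x⟩ − t H(p) − g*(p)}, where g* is the Fenchel conjugate of g. Each point evaluation is then a finite-dimensional optimization problem whose cost does not grow exponentially in the dimension. A minimal sketch, assuming H(p) = |p|²/2 and g(x) = |x|²/2 (so g* = g and the exact solution is |x|²/(2(1 + t))); the function names and the use of BFGS are illustrative choices, not the method of [40]:

```python
import numpy as np
from scipy.optimize import minimize

# Model problem: u_t + H(grad u) = 0, u(x, 0) = g(x), in dimension d.
d = 10
H      = lambda p: 0.5 * np.dot(p, p)   # Hamiltonian
g_star = lambda p: 0.5 * np.dot(p, p)   # Fenchel conjugate of g(x) = |x|^2 / 2

def hopf_solution(x, t):
    """Evaluate u(x,t) = sup_p <p,x> - t*H(p) - g*(p) by numerical optimization."""
    obj = lambda p: -(p @ x - t * H(p) - g_star(p))   # minimize the negative
    res = minimize(obj, np.zeros_like(x), method="BFGS")
    return -res.fun

x = np.ones(d)
u = hopf_solution(x, t=1.0)
# exact viscosity solution at this point: |x|^2 / (2*(1 + t)) = 10/4 = 2.5
```

Note that the optimization is over a d-dimensional vector p regardless of any spatial grid, which is what lets the cost scale polynomially in the dimension.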
“…In addition to the aforementioned data-driven machine learning techniques, physics-informed machine learning techniques, also known as physics-informed neural networks [156,157], have emerged as an alternative for ill-posed and inverse problems. Fundamental physical laws and domain knowledge are embedded by exploiting observational data [158,159], tailoring the neural network architecture to physics constraints [160,161], and/or imposing physics constraints in the loss function [162,163]. Physics-informed neural networks are expected to outperform existing machine learning methods on partially understood, uncertain, and high-dimensional problems.…”
Section: Metamodel-based Optimization Enhanced by Machine Learning
confidence: 99%
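The "physics constraints in the loss function" idea can be sketched in a few lines: the PDE residual, evaluated at collocation points, is added to the data-mismatch terms as a soft penalty. The snippet below is a hedged numpy illustration using the heat equation u_t − ν u_xx = 0 with initial condition u(x, 0) = sin(πx); the tiny network is untrained, derivatives are taken by finite differences rather than the automatic differentiation a real PINN would use, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny surrogate network u_theta(x, t): one hidden tanh layer (illustrative, untrained).
W1 = rng.standard_normal((2, 32)) * 0.5
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 1)) * 0.5

def u_net(x, t):
    h = np.tanh(np.stack([x, t], axis=-1) @ W1 + b1)
    return (h @ W2)[..., 0]

def pinn_loss(x_col, t_col, x0, nu=0.1, eps=1e-4):
    """Physics-informed loss for the heat equation u_t - nu * u_xx = 0.

    The PDE residual at collocation points acts as a soft constraint;
    derivatives are approximated by central finite differences here.
    """
    u_t  = (u_net(x_col, t_col + eps) - u_net(x_col, t_col - eps)) / (2 * eps)
    u_xx = (u_net(x_col + eps, t_col) - 2 * u_net(x_col, t_col)
            + u_net(x_col - eps, t_col)) / eps**2
    residual = u_t - nu * u_xx                                     # PDE residual term
    ic_error = u_net(x0, np.zeros_like(x0)) - np.sin(np.pi * x0)   # u(x, 0) = sin(pi*x)
    return np.mean(residual**2) + np.mean(ic_error**2)

loss = pinn_loss(x_col=rng.uniform(0, 1, 100),
                 t_col=rng.uniform(0, 1, 100),
                 x0=np.linspace(0, 1, 50))
```

Training would minimize this scalar over the network weights, driving the solution toward both the data and the governing PDE simultaneously.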