2020
DOI: 10.1016/j.cma.2020.113028

Conservative physics-informed neural networks on discrete domains for conservation laws: Applications to forward and inverse problems

Cited by 657 publications (279 citation statements)
References 16 publications
Citation statements (ordered by relevance):
“…The main feature of the PINN is that it can easily incorporate all the given information, such as the governing equation, experimental data, and initial/boundary conditions, into the loss function, thereby recasting the original problem as an optimization problem. One of the main limitations of the PINN algorithm is its high computational cost for high-dimensional optimization problems, which is addressed in [13] by employing the domain decomposition approach. The PINN algorithm aims to learn a surrogate $u = u_{\hat{\boldsymbol{\Theta}}}$ for predicting the solution $u$ of the governing PDE.…”
Section: Methods (mentioning)
confidence: 99%
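
As a concrete illustration of the composite loss described in the excerpt above, the sketch below folds the PDE residual, initial/boundary conditions, and optional measured data into a single objective for a generic 1D conservation law u_t + f(u)_x = 0 with f(u) = u^2/2. It is a minimal PyTorch sketch under our own assumptions (network size, flux choice, weights, and batch keys are all illustrative), not the implementation used in the paper.

```python
# Minimal PINN loss sketch (illustrative assumptions only, not the paper's code).
# Model problem: 1D conservation law u_t + f(u)_x = 0 with f(u) = u^2 / 2.
import torch
import torch.nn as nn

class Surrogate(nn.Module):
    """Small fully connected network u_theta(t, x) playing the role of the surrogate."""
    def __init__(self, width=20, depth=4):
        super().__init__()
        layers, in_dim = [], 2
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers += [nn.Linear(in_dim, 1)]
        self.net = nn.Sequential(*layers)

    def forward(self, t, x):
        return self.net(torch.cat([t, x], dim=1))

def pde_residual(model, t, x):
    """Strong-form residual u_t + (u^2/2)_x evaluated by automatic differentiation."""
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    u = model(t, x)
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    flux = 0.5 * u ** 2
    flux_x = torch.autograd.grad(flux, x, torch.ones_like(flux), create_graph=True)[0]
    return u_t + flux_x

def pinn_loss(model, batch, w_res=1.0, w_ic=1.0, w_bc=1.0, w_data=1.0):
    """Composite loss: PDE residual + initial/boundary terms + optional experimental data."""
    mse = nn.MSELoss()
    res = pde_residual(model, batch["t_f"], batch["x_f"])
    loss = w_res * mse(res, torch.zeros_like(res))
    loss = loss + w_ic * mse(model(batch["t0"], batch["x0"]), batch["u0"])
    loss = loss + w_bc * mse(model(batch["tb"], batch["xb"]), batch["ub"])
    if "u_data" in batch:  # measured data, e.g. for inverse problems
        loss = loss + w_data * mse(model(batch["t_d"], batch["x_d"]), batch["u_data"])
    return loss
```

Minimizing this single objective with a stochastic optimizer is what the excerpt means by recasting the original PDE problem as an optimization problem.
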
“…In addition, PINNs have been applied successfully in a wide range of applications, including fluid dynamics [113,115,117,160,177], continuum mechanics and elastodynamics [66,132,162], inverse problems [91,121], fractional advection–diffusion equations [135], stochastic advection–diffusion–reaction equations [34], stochastic differential equations [179] and power systems [127]. Finally, we mention that Gaussian processes as an alternative to neural networks for approximating complex multivariate functions have also been studied extensively for solving PDEs and inverse problems [136,155,158,164].…”
Section: Physics-informed Neural Network (mentioning)
confidence: 99%
“…The basic interface conditions for XPINN include the residual continuity condition in strong form as well as enforcing the average solution given by the different sub-networks along the common interface. As discussed in the cPINN framework [46], enforcing the average solution along the common interface is not necessary for stability, but computational experiments reveal that it drastically speeds up the convergence rate.…”
Section: Extended Physics-informed Neural Network (mentioning)
confidence: 99%
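
The two interface terms named in this excerpt can be written schematically as below. The notation is ours and only paraphrases the description above (sub-networks u_{θ+} and u_{θ−} on either side of a common interface Γ, residual operator F, and N_Γ interface points); it is not the paper's exact formulation, and penalty weights are omitted.

```latex
% Schematic interface penalties along a common interface \Gamma (illustrative notation).
\begin{aligned}
\mathcal{L}_{\text{res},\Gamma} &= \frac{1}{N_\Gamma}\sum_{i=1}^{N_\Gamma}
  \left| \mathcal{F}\!\left[u_{\theta^{+}}\right](x_i) - \mathcal{F}\!\left[u_{\theta^{-}}\right](x_i) \right|^2
  && \text{(residual continuity in strong form)} \\
\mathcal{L}_{\text{avg},\Gamma} &= \frac{1}{N_\Gamma}\sum_{i=1}^{N_\Gamma}
  \left| u_{\theta^{\pm}}(x_i) - \tfrac{1}{2}\!\left(u_{\theta^{+}}(x_i) + u_{\theta^{-}}(x_i)\right) \right|^2
  && \text{(average-solution enforcement along the interface)}
\end{aligned}
```
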
“…Therefore, it is crucial to accelerate the convergence of such models without sacrificing performance. This issue was first addressed in the conservative PINN (cPINN) method for conservation laws [46] by employing the domain decomposition approach in the PINN framework. Domain decomposition has been a fundamental development in standard numerical methods, e.g.…”
Section: Introduction (mentioning)
confidence: 99%
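
To make the domain decomposition idea concrete, the sketch below assigns one surrogate network per subdomain and trains them jointly with an interface term coupling the neighboring networks, in the spirit of cPINN. It reuses the illustrative Surrogate, pde_residual, and pinn_loss helpers from the earlier sketch, and all choices (two subdomains, the Burgers-type flux, equal weights) are our own assumptions rather than the reference formulation.

```python
# Domain-decomposition sketch in the spirit of cPINN (illustrative assumptions only):
# the domain is split into two subdomains, each with its own surrogate network, and the
# networks are trained jointly with interface terms tying solutions and fluxes together.
import torch
import torch.nn as nn

def interface_loss(model_left, model_right, t_i, x_i):
    """Penalize solution and flux mismatch at shared interface points (t_i, x_i)."""
    mse = nn.MSELoss()
    u_l, u_r = model_left(t_i, x_i), model_right(t_i, x_i)
    avg = 0.5 * (u_l + u_r)                            # average solution along the interface
    flux_l, flux_r = 0.5 * u_l ** 2, 0.5 * u_r ** 2    # same flux f(u) = u^2/2 as above
    return mse(u_l, avg) + mse(u_r, avg) + mse(flux_l, flux_r)

def decomposed_loss(models, batches, interface_batch):
    """Sum of per-subdomain PINN losses plus the interface coupling term."""
    loss = sum(pinn_loss(m, b) for m, b in zip(models, batches))
    loss = loss + interface_loss(models[0], models[1],
                                 interface_batch["t_i"], interface_batch["x_i"])
    return loss

# Joint training: both networks' parameters are updated against the combined objective.
# models = [Surrogate(), Surrogate()]
# optimizer = torch.optim.Adam([p for m in models for p in m.parameters()], lr=1e-3)
```
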