2021
DOI: 10.48550/arxiv.2111.02484
Preprint

Accelerated replica exchange stochastic gradient Langevin diffusion enhanced Bayesian DeepONet for solving noisy parametric PDEs

Abstract: The Deep Operator Network (DeepONet) is a fundamentally different class of neural networks that we train to approximate nonlinear operators, including the solution operator of parametric partial differential equations (PDEs). DeepONets have shown remarkable approximation and generalization capabilities even when trained with relatively small datasets. However, the performance of DeepONets deteriorates when the training data is polluted with noise, a scenario that occurs very often in practice. To enable DeepON…
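The abstract describes DeepONet as a network trained to approximate nonlinear operators. A minimal sketch of the branch-trunk structure is shown below, assuming the standard formulation G(u)(y) ≈ Σ_k b_k(u) · t_k(y): the branch net encodes the input function u sampled at m fixed sensors, the trunk net encodes the query location y, and the prediction is their inner product. All layer sizes, names, and the single-layer nets here are illustrative stand-ins for trained networks, not the paper's architecture.

```python
import math
import random

random.seed(0)

M_SENSORS = 10   # number of fixed sensor locations x_1..x_m
P_FEATURES = 8   # width of the shared feature dimension

def make_layer(n_in, n_out):
    """Random linear layer (weights, bias); a stand-in for a trained net."""
    w = [[random.gauss(0, 1 / math.sqrt(n_in)) for _ in range(n_in)]
         for _ in range(n_out)]
    b = [0.0] * n_out
    return w, b

def forward(layer, x):
    """Linear map followed by a tanh nonlinearity."""
    w, b = layer
    return [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bi)
            for row, bi in zip(w, b)]

branch = make_layer(M_SENSORS, P_FEATURES)  # encodes u(x_1)..u(x_m)
trunk = make_layer(1, P_FEATURES)           # encodes the query point y

def deeponet(u_sensors, y):
    """Approximate G(u)(y) as the inner product of branch and trunk features."""
    b_feat = forward(branch, u_sensors)
    t_feat = forward(trunk, [y])
    return sum(bk * tk for bk, tk in zip(b_feat, t_feat))

# Example: evaluate the (untrained) operator on u(x) = sin(pi * x).
u = [math.sin(math.pi * i / (M_SENSORS - 1)) for i in range(M_SENSORS)]
print(deeponet(u, 0.5))  # a scalar prediction for G(u)(0.5)
```

In a real implementation both nets are deep and trained jointly on (u, y, G(u)(y)) triples; the key design point is that the branch and trunk share only the final feature dimension.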

Cited by 5 publications
(7 citation statements)
References 25 publications
“…The objective of this section is to demonstrate the advantage of U-DeepONet as compared to standard DeepONet, which does not provide epistemic uncertainty estimates. Concurrently with the present work, Lin et al [92] employed replica-exchange MCMC in conjunction with DeepONet for operator learning. Their model is a version of U-DeepONet, with MCMC as the posterior inference algorithm.…”
Section: U-DeepONet: Combining DeepONet with Deep Ensemble for Incorp…
confidence: 93%
“…(8). Note that concurrently with the present work, Markov chain Monte Carlo has been employed for posterior inference in conjunction with DeepONet-based operator learning [92].…”
Section: Uncertainty in DeepONets
confidence: 99%
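The citation statements above concern the paper's posterior inference scheme, replica-exchange stochastic gradient Langevin dynamics. A toy sketch of the general technique is given below on a 1-D loss U(w) = (w − 3)²/2: two replicas run SGLD at a low and a high temperature, and they periodically attempt a Metropolis-style state swap. The step size, temperatures, swap interval, and loss are illustrative assumptions, not the paper's accelerated scheme.

```python
import math
import random

random.seed(0)

def loss(w):
    """Toy energy U(w) with a single minimum at w = 3."""
    return 0.5 * (w - 3.0) ** 2

def grad(w):
    return w - 3.0

def sgld_step(w, lr, temperature):
    """One SGLD update: gradient step plus temperature-scaled Gaussian noise."""
    noise = random.gauss(0.0, math.sqrt(2.0 * lr * temperature))
    return w - lr * grad(w) + noise

w_low, w_high = 0.0, 0.0    # replica states
T_LOW, T_HIGH = 0.01, 1.0   # cold replica exploits, hot replica explores
LR = 0.05

for step in range(2000):
    w_low = sgld_step(w_low, LR, T_LOW)
    w_high = sgld_step(w_high, LR, T_HIGH)
    if step % 50 == 0:
        # Metropolis acceptance for swapping the two replica states:
        # accept with probability min(1, exp((1/T_low - 1/T_high)(U_low - U_high))).
        log_ratio = (1.0 / T_LOW - 1.0 / T_HIGH) * (loss(w_low) - loss(w_high))
        if math.log(random.random()) < log_ratio:
            w_low, w_high = w_high, w_low

print(w_low)  # the low-temperature replica should settle near 3.0
```

The swap rule lets the cold replica inherit good states found by the hot replica, which is what gives replica exchange its acceleration over a single Langevin chain; in the Bayesian DeepONet setting, w would be the network weights and U(w) the (noisy) negative log-posterior.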
“…Among these approaches, DeepONet has been applied and demonstrated good performance in diverse applications, such as high-speed boundary layer problems [7], multiphysics and multiscale problems of hypersonics [31] and electroconvection [2], multiscale bubble growth dynamics [22,23], fractional derivative operators [27], stochastic differential equations [27], solar-thermal systems [34], and aortic dissection [45]. Several extensions of DeepONet have also been developed, such as Bayesian DeepONet [24], DeepONet with proper orthogonal decomposition (POD-DeepONet) [28], multiscale DeepONet [25], neural operator with coupled attention [13], and physics-informed DeepONet [42,9].…”
Section: Introduction
confidence: 99%
“…Different architectures of deep neural operators have been developed, such as deep operator networks (DeepONet) [11,12,14], Fourier neural operators [13,14], nonlocal kernel networks [15], and several others [16,17,18]. Among these deep neural operators, DeepONet was the first one to be proposed (in 2019) [11], and many subsequent extensions and improvements have been developed, such as DeepONet with proper orthogonal decomposition (POD-DeepONet) [14], DeepONet for multiple-input operators (MIONet) [19], DeepONet for multi-physics problems via physics decomposition (DeepM&Mnet) [20,21], DeepONet with uncertainty quantification [22,23,24], multiscale DeepONet [25], POD-DeepONet with causality [26], and physics-informed DeepONet [27,28].…”
Section: Introduction
confidence: 99%