2022
DOI: 10.48550/arxiv.2204.06684
Preprint

Multifidelity deep neural operators for efficient learning of partial differential equations with application to fast inverse design of nanoscale heat transport

Abstract: Deep neural operators can learn operators mapping between infinite-dimensional function spaces via deep neural networks and have become an emerging paradigm of scientific machine learning. However, training neural operators usually requires a large amount of high-fidelity data, which is often difficult to obtain in real engineering problems. Here, we address this challenge by using multifidelity learning, i.e., learning from multifidelity datasets. We develop a multifidelity neural operator based on a deep ope…
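
The core idea, supplementing scarce high-fidelity training data with abundant low-fidelity data, can be illustrated with a simple residual-correction sketch. The code below is a minimal illustration of that idea in PyTorch, not the paper's multifidelity DeepONet architecture; the toy solvers, network sizes, and sample counts are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Minimal residual-correction sketch of multifidelity learning (illustrative
# only; the paper's multifidelity DeepONet is more elaborate). We assume a
# cheap low-fidelity solver yields many samples and an expensive
# high-fidelity solver yields only a few.
torch.manual_seed(0)

def mlp():
    return nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))

# Hypothetical toy data: low fidelity is a biased version of high fidelity.
x_lf = torch.rand(1000, 1)                  # abundant low-fidelity inputs
y_lf = torch.sin(6 * x_lf)                  # low-fidelity "solver" output
x_hf = torch.rand(20, 1)                    # scarce high-fidelity inputs
y_hf = torch.sin(6 * x_hf) + 0.3 * x_hf**2  # high-fidelity "solver" output

# Step 1: fit a surrogate on the abundant low-fidelity data.
lf_net = mlp()
opt = torch.optim.Adam(lf_net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(lf_net(x_lf), y_lf)
    loss.backward()
    opt.step()

# Step 2: fit a small correction network on the high-fidelity residuals,
# so the final prediction is lf_net(x) + corr_net(x).
corr_net = mlp()
opt = torch.optim.Adam(corr_net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    residual = y_hf - lf_net(x_hf).detach()  # freeze the low-fidelity net
    loss = nn.functional.mse_loss(corr_net(x_hf), residual)
    loss.backward()
    opt.step()

def predict(x):
    """Multifidelity prediction: low-fidelity surrogate plus learned correction."""
    return lf_net(x) + corr_net(x)
```

The correction network only has to learn the discrepancy between the two fidelities, which is typically much simpler than the full solution operator; this is why a handful of high-fidelity samples can suffice.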

Cited by 6 publications (7 citation statements)
References 48 publications (65 reference statements)
“…- To reduce the dependency on large paired datasets (when accurate information about the governing law of the physical system is not available), Deep Transfer Operator Learning (DeepONet) [8] can be used for accurate prediction of quantities of interest in related domains with only a handful of new measurements. - Faster ways to train neural operators can be developed by incorporating multimodal and multifidelity data [55,56,57]. - The applications of neural operators in the life sciences are endless.…”
Section: Discussion
confidence: 99%
“…To that end, training a DeepONet or any other neural operator with only high-fidelity data would be computationally expensive. One promising way to solve such realistic problems is to use the multifidelity approaches proposed in [55,56,57]. Such real-world problems leverage the general formulation of DeepONet, which can be flexibly designed for any problem at hand.…”
Section: Layout 2: 64 Turbines
confidence: 99%
“…DeepXDE also features NNs for nonlinear operator learning such as DeepONet [97], POD-DeepONet [98], MIONet [99], DeepM&Mnet [100,101] and multifidelity DeepONet [102].…”
Section: DeepXDE
confidence: 99%
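
For readers unfamiliar with that interface, the sketch below shows what a minimal single-fidelity DeepONet setup in DeepXDE might look like. It uses random placeholder data and hypothetical layer sizes, and it is the plain DeepONet, not the multifidelity variant from the paper.

```python
import numpy as np
import deepxde as dde

# Hypothetical toy problem: n_train input functions, each sampled at m fixed
# sensor locations (branch input), with the output function evaluated at
# n_query points (trunk input). All values here are random placeholders.
m, n_train, n_query = 50, 100, 40
rng = np.random.default_rng(0)
X_branch = rng.random((n_train, m)).astype(np.float32)
X_trunk = np.linspace(0, 1, n_query, dtype=np.float32).reshape(-1, 1)
y = rng.random((n_train, n_query)).astype(np.float32)

data = dde.data.TripleCartesianProd(
    X_train=(X_branch, X_trunk), y_train=y,
    X_test=(X_branch, X_trunk), y_test=y,
)
net = dde.nn.DeepONetCartesianProd(
    [m, 40, 40],   # branch net: encodes the sampled input function
    [1, 40, 40],   # trunk net: encodes the output query location
    "relu",
    "Glorot normal",
)
model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
model.train(iterations=1000)  # older DeepXDE versions use epochs= instead
```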
“…However, to train DNNs, solutions of PDEs are required as a training dataset, which must be provided analytically or numerically in advance. The deep operator network (DeepONet, [18]) has also been studied extensively in multiple fields, such as multiphysics and multiscale problems in [50], [51], a method using the Fourier neural operator in [52], a method for multiple-input operators in [53], and a method for input data with varying degrees of accuracy in [54]. The authors in [20], [21] introduced an operator-learning approach that can learn multiple PDE solutions.…”
Section: Related Work
confidence: 99%