2021
DOI: 10.48550/arxiv.2107.10991
Preprint

A novel meta-learning initialization method for physics-informed neural networks

Xu Liu,
Xiaoya Zhang,
Wei Peng
et al.

Abstract: Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, large training costs limit PINNs for some real-time applications. Although some works have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data. PIN…

Cited by 3 publications (6 citation statements) | References 47 publications
“…Compared to the traditional loss function that uses labeled data for computing the deviations, the physical term does not need labels and works for both supervised and unsupervised methods. The integration of physics-based terms into loss functions has shown benefits, as demonstrated in [43] and [44]. In [43], a physics-based loss function is constructed and added to the main loss function by using the physical relations between density, temperature, and depth of water to model the water temperature in a lake.…”
Section: A. PI Strategies in Supervised and Semi-supervised Learning M… (mentioning)
Confidence: 99%
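The physics-based penalty described above can be illustrated with a minimal sketch. The residual used here (a toy relation d(pred)/dx = -x) and all function names are hypothetical stand-ins, not the construction from [43]:

```python
import numpy as np

def data_loss(pred, target):
    # Standard supervised term: mean squared error against labels.
    return np.mean((pred - target) ** 2)

def physics_residual(pred, x, dx):
    # Hypothetical physics term: penalize violation of a simple
    # relation d(pred)/dx = -x (illustrative only).
    dpred_dx = np.gradient(pred, dx)
    return np.mean((dpred_dx + x) ** 2)

def total_loss(pred, target, x, dx, lam=0.1):
    # The physics term needs no labels, so it also applies in
    # unsupervised settings where `target` is unavailable.
    return data_loss(pred, target) + lam * physics_residual(pred, x, dx)
```

With a prediction that satisfies the toy relation exactly (pred = -x²/2), the physics term is near zero up to finite-difference error at the boundary points.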
“…In most works, researchers resort to the Xavier initialisation [14] for selecting the PINN's initial weights and biases. The effects of using more refined initialisation procedures have recently been gaining attention, with [39] showing that a good initialisation can provide PINNs with a head start, allowing them to achieve fast convergence and improved accuracy. Transfer learning for PINNs was introduced by [5] and [15] to initialise PINNs for dealing with multi-fidelity problems and brittle fracture problems, respectively.…”
Section: State-of-the-art and Related Work (mentioning)
Confidence: 99%
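For reference, the Xavier (Glorot) uniform initialisation mentioned above can be sketched in a few lines; the function name is an assumption, but the scaling rule is the standard one:

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    # Glorot/Xavier uniform initialisation: weights drawn from
    # U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    # chosen to keep activation variance roughly constant across layers.
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```

A refined initialisation, as in [39], replaces this generic draw with weights adapted to the PDE family at hand.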
“…Subsequently, [44] proposed the REPTILE algorithm, which turns the second-order optimisation of MAML into a first-order approximation and therefore requires significantly less computation and memory while achieving similar performance. [39] applied the REPTILE algorithm to PINNs by regarding modifications of PDE parameters as separate tasks. The resulting initialisation is such that the PINN converges in just a few optimisation steps for any choice of PDE parameters.…”
Section: State-of-the-art and Related Work (mentioning)
Confidence: 99%
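The Reptile update described above admits a compact sketch. Here each "task" is a toy quadratic loss standing in for a PINN loss at one choice of PDE parameters; the names and hyperparameters are illustrative, not those of [44] or [39]:

```python
import numpy as np

def sgd_inner(theta, grad_fn, steps=5, lr=0.01):
    # Plain SGD on one task, starting from the shared initialisation.
    w = theta.copy()
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

def reptile(theta, tasks, meta_lr=0.1, inner_steps=5, inner_lr=0.01):
    # Reptile meta-update: move the initialisation toward the weights
    # reached after a few inner-loop steps on each task. Unlike MAML,
    # this first-order rule needs no second-order derivatives.
    for grad_fn in tasks:
        w_task = sgd_inner(theta, grad_fn, inner_steps, inner_lr)
        theta = theta + meta_lr * (w_task - theta)
    return theta
```

Because the meta-update only differences two weight vectors, its cost and memory footprint match ordinary training, which is the practical advantage over MAML noted above.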
“…In [20], the authors report a significant error reduction by correcting solutions using neural networks trained with PDE solvers. In this line of research, there are several works where meta-learning approaches are taken to solve computational problems [23,25,26,27,28,29]. Meta-learning, or learning to learn, leverages previous learning experiences to improve future learning performance [30], which fits the motivation of utilizing the data from previously solved equations for the next one.…”
Section: Introduction (mentioning)
Confidence: 99%
“…For instance, [25] uses meta-learning to generate smoothers of the Multi-grid Network for parametrized PDEs, and [26] proposes a meta-learning approach to learn effective solvers based on the Runge-Kutta method for ordinary differential equations (ODEs). In [27,28], meta-learning is used to accelerate the training of physics-informed neural networks for solving PDEs.…”
Section: Introduction (mentioning)
Confidence: 99%