2022
DOI: 10.1088/2632-2153/ac3712
Inverse Dirichlet weighting enables reliable training of physics informed neural networks

Abstract: We characterize and remedy a failure mode that may arise from multi-scale dynamics with scale imbalances during training of deep neural networks, such as Physics Informed Neural Networks (PINNs). PINNs are popular machine-learning templates that allow for seamless integration of physical equation models with data. Their training amounts to solving an optimization problem over a weighted sum of data-fidelity and equation-fidelity objectives. Conflicts between objectives can arise from scale imbalances, heterosc…
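The abstract's description of PINN training as minimizing a weighted sum of data-fidelity and equation-fidelity objectives can be made concrete with a short sketch. The following is a minimal illustration in PyTorch, not the paper's implementation; the network architecture, the toy 1D diffusion residual u_t = nu * u_xx, and the fixed weights lambda_data and lambda_pde are assumptions chosen for exposition.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed setup, not the paper's code): a PINN for a 1D diffusion
# equation u_t = nu * u_xx, trained on a weighted sum of a data-fidelity and an
# equation-fidelity (PDE residual) objective.
net = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
nu = 0.1                             # assumed diffusion coefficient
lambda_data, lambda_pde = 1.0, 1.0   # static objective weights; adaptive schemes update these

def pde_residual(xt):
    # xt has columns (x, t); differentiate the network output w.r.t. both inputs.
    xt = xt.clone().requires_grad_(True)
    u = net(xt)
    du = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = du[:, 0:1], du[:, 1:2]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x), create_graph=True)[0][:, 0:1]
    return u_t - nu * u_xx

def total_loss(xt_data, u_data, xt_colloc):
    loss_data = ((net(xt_data) - u_data) ** 2).mean()   # data-fidelity term
    loss_pde = (pde_residual(xt_colloc) ** 2).mean()    # equation-fidelity term
    return lambda_data * loss_data + lambda_pde * loss_pde
```

The failure mode the paper targets arises precisely when the two terms in `total_loss` live on very different scales, so that one objective dominates the gradient and the other is effectively ignored.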

Cited by 33 publications (39 citation statements) | References 46 publications

“…Our results are in line with the frequent observation that the PDE loss weight is an important hyperparameter. Several works have put forth methodologies to choose the weights adaptively during training 8–11, but in practice they have also been chosen via trial-and-error 43, 55, 56. However, in settings with noisy data, it cannot be expected that both the data loss and the PDE loss become zero.…”
Section: Discussion
confidence: 99%
“…Another challenge in training PINNs is balancing boundary, initial, and PDE loss terms. This challenge has been addressed by adaptive weighting strategies 8–11, as well as the theory of functional connections 12, 13. Despite these challenges, the effectiveness of the method has been demonstrated in a wide range of works; examples include turbulent flows 14, heat transfer 15, epidemiological compartmental models 16, or stiff chemical systems 17.…”
Section: Introduction
confidence: 99%
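The excerpt above points to the recipe shared by adaptive weighting strategies: assemble one loss per constraint (boundary, initial, PDE), rescale each term with a weight, and update those weights during training instead of fixing them by trial and error. The skeleton below is a hedged illustration of that pattern only, not any specific cited scheme; the term names and the `update_weights` hook are placeholders introduced here.

```python
import torch

# Generic training-loop step for a multi-term PINN loss (illustrative only).
# loss_fns maps a term name (e.g. 'pde', 'boundary', 'initial') to a callable
# returning a scalar loss for the current batch; weights maps names to floats.
def train_step(optimizer, loss_fns, weights, update_weights=None):
    losses = {name: fn() for name, fn in loss_fns.items()}
    if update_weights is not None:
        weights = update_weights(losses, weights)   # adaptive rebalancing hook
    total = sum(weights[name] * losses[name] for name in losses)
    optimizer.zero_grad()
    total.backward()
    optimizer.step()
    return total.item(), weights
```

With `update_weights=None` this reduces to the fixed, hand-tuned weighting the citing authors describe as common practice; the adaptive strategies cited above differ only in how the hook is filled in.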
“…Maddu et al. [13] suggest a multiobjective descent method that adaptively updates the weights using an inverse Dirichlet strategy to avoid premature termination. While they do not discuss convergence guarantees, their numerical results show good performance in comparison with recent adaptations of multiobjective descent methods [14, 15] to PINN training [16].…”
Section: Related Research
confidence: 99%
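Reference [13] in this excerpt is the paper under discussion. A minimal sketch of the core idea, as described in the abstract and this excerpt, is given below: each loss term's weight is derived from the spread (standard deviation) of that term's gradient with respect to the network parameters, so that no single objective dominates the descent direction. The normalization by the largest gradient spread, the small epsilon guard, and the moving-average smoothing with `alpha` are assumptions made for illustration, not a verbatim reproduction of the authors' algorithm.

```python
import torch

def grad_std(loss, params):
    """Standard deviation of one loss term's gradient over all network parameters."""
    grads = torch.autograd.grad(loss, params, retain_graph=True, allow_unused=True)
    flat = torch.cat([g.reshape(-1) for g in grads if g is not None])
    return flat.std()

def inverse_dirichlet_weights(losses, params, old_weights=None, alpha=0.5):
    """Sketch of gradient-spread-based weighting: each term is scaled by the ratio
    of the largest gradient std to its own gradient std, optionally smoothed with
    an exponential moving average (alpha is an assumed value)."""
    stds = {k: grad_std(loss, params) for k, loss in losses.items()}
    s_max = max(stds.values())
    new = {k: (s_max / (s + 1e-12)).detach() for k, s in stds.items()}
    if old_weights is not None:
        new = {k: alpha * old_weights[k] + (1 - alpha) * new[k] for k in new}
    return new
```

Plugged into the training-loop skeleton above (e.g. via a small wrapper that also passes the network parameters), this keeps the per-term gradient magnitudes on a comparable scale rather than relying on a fixed, hand-tuned PDE loss weight.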
“…The importance of objectives in view of the Pareto front has been analyzed in [19]. The aforementioned study also covers the effects of system parameters, such as the computational domain, and the efficiency of adaptive loss weighting schemes [20, 21]. In general, adaptive approaches have become popular in the optimization of PINNs [22, 23], including adaptive activation functions [24, 25] and soft attention mechanisms [26, 27].…”
Section: Related Work
confidence: 99%