2021
DOI: 10.48550/arxiv.2106.11753
Preprint

Symplectic Learning for Hamiltonian Neural Networks

Abstract: Machine learning methods are widely used in the natural sciences to model and predict physical systems from observation data. Yet, they are often used as poorly understood "black boxes," disregarding existing mathematical structure and invariants of the problem. Recently, the proposal of Hamiltonian Neural Networks (HNNs) took a first step towards a unified "gray box" approach, using physical insight to improve performance for Hamiltonian systems. In this paper, we explore a significantly improved training met…

Cited by 4 publications (6 citation statements)
References 14 publications
“…In this work, models are trained on an approximation of the ODE (11) as made by some discretization method corresponding to a numerical integration scheme, as done in [24,21,8]. That is, given the integrator…”
Section: Choice of Discretization Methods for Training Data
confidence: 99%
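The statement above describes training on a discretization of the ODE rather than on raw time derivatives: the model is fit so that one step of a chosen numerical integrator, driven by the model's vector field, reproduces the observed next state. A minimal sketch of that one-step loss, using symplectic Euler and the exact harmonic-oscillator field as a stand-in for a learned Hamiltonian (both are illustrative assumptions, not the paper's implementation):

```python
import math

# Stand-in for a learned Hamiltonian vector field; here the exact field for
# H(q, p) = (q^2 + p^2) / 2, so dq/dt = p and dp/dt = -q.
def model_field(q, p):
    return p, -q

def symplectic_euler_step(q, p, h):
    # Symplectic Euler: update p with the old q, then q with the new p.
    _, dp = model_field(q, p)
    p_new = p + h * dp
    dq, _ = model_field(q, p_new)
    q_new = q + h * dq
    return q_new, p_new

h = 0.1  # hypothetical sampling interval

# An observed transition (q0, p0) -> (q1, p1) taken from the exact flow:
t = 0.0
q0, p0 = math.sin(t), math.cos(t)
q1, p1 = math.sin(t + h), math.cos(t + h)

# One-step loss: discrepancy between the integrator's prediction and the data.
q_hat, p_hat = symplectic_euler_step(q0, p0, h)
one_step_loss = (q_hat - q1) ** 2 + (p_hat - p1) ** 2
```

In training, this loss would be summed over all observed transitions and minimized with respect to the network parameters inside `model_field`.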
“…with the Neural Ordinary Differential Equation method [5]. In this paper we follow an alternative approach and train on an integration scheme directly as done in [24,21,8], see Section 3.…”
Section: Introduction
confidence: 99%
“…In fact, following the PyTorch implementation of the mean squared error, E 1 is actually divided by 2n. Alternatively, as introduced in David and Méhats (2021), one can compare pointwise values of the approximated and the true Hamiltonian, when known. This gives…”
Section: Description of the Problem
confidence: 99%
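The two losses contrasted above can be sketched in pure Python. The sample points, the toy Hamiltonians, and the small model error are all hypothetical; the point is only the bookkeeping: PyTorch's mean-squared-error convention divides the summed residual E1 by the total number of elements (2n for n two-component samples), while the alternative loss compares the model's Hamiltonian to the true one pointwise, which requires the true Hamiltonian to be known.

```python
samples = [(0.1, 0.9), (0.5, 0.4), (-0.3, 0.7)]  # hypothetical (q, p) pairs
n = len(samples)

def H_true(q, p):
    # Harmonic oscillator, used here purely as an illustration.
    return 0.5 * (q * q + p * p)

def H_model(q, p):
    # Stand-in for a trained network: the true H plus a small error term.
    return 0.5 * (q * q + p * p) + 0.01 * q

# Hand-derived gradients of H_model (a network would use autodiff):
def dHm_dq(q, p): return q + 0.01
def dHm_dp(q, p): return p

# (1) Vector-field residuals (dq/dt - dH/dp, dp/dt + dH/dq) for the true
#     dynamics dq/dt = p, dp/dt = -q:
residuals = [(p - dHm_dp(q, p), -q + dHm_dq(q, p)) for q, p in samples]
E1 = sum(rq * rq + rp * rp for rq, rp in residuals)
mse_vector_field = E1 / (2 * n)  # MSE convention: divide by 2n elements

# (2) Pointwise Hamiltonian comparison, possible only when H_true is known:
mse_hamiltonian = sum((H_model(q, p) - H_true(q, p)) ** 2
                      for q, p in samples) / n
```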
“…The derivatives ∂H_k/∂p, ∂H_k/∂q are calculated by differentiating the neural network which models the Hamiltonian, while the derivatives dp_k/dt, dq_k/dt are either assumed to be known (from the simulator) or approximated with finite differences. Using finite differences to approximate the derivatives dp/dt and dq/dt is essentially equivalent to Euler integration with a time step being equal to the sampling interval, which limits the accuracy of the trained model [8].…”
Section: Hamiltonian Neural Network
confidence: 99%
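The equivalence claimed above — finite-difference derivative targets behave like one step of explicit Euler with step size equal to the sampling interval — can be checked numerically. A small sketch on the harmonic oscillator (the system and step size are illustrative assumptions): the forward difference of two consecutive samples deviates from the true derivative by O(h), exactly the local accuracy of explicit Euler.

```python
import math

# Harmonic oscillator: H(q, p) = (q^2 + p^2) / 2, with exact trajectory
# q(t) = sin(t), p(t) = cos(t), so dq/dt = p and dp/dt = -q.
h = 0.01  # hypothetical sampling interval
t = 0.3

q0, p0 = math.sin(t), math.cos(t)
q1, p1 = math.sin(t + h), math.cos(t + h)

# Finite-difference derivative targets built from two consecutive samples:
dq_fd = (q1 - q0) / h
dp_fd = (p1 - p0) / h

# True derivatives from Hamilton's equations at time t:
dq_true, dp_true = p0, -q0

# The finite-difference targets differ from the true derivatives by O(h),
# matching the first-order local accuracy of explicit Euler integration.
err_q = abs(dq_fd - dq_true)
err_p = abs(dp_fd - dp_true)
```

Halving the sampling interval h roughly halves both errors, which is the sense in which the sampling rate caps the accuracy of a model trained on such targets.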
“…The original HNN model [11] had the limitation of assuming the knowledge of the state derivatives with respect to time or approximating those using finite differences. Many recent works have used numerical integrators for modeling the evolution of the system state and several improvements of the integration procedure have been proposed [5,8,9,16,25,28].…”
Section: Introduction
confidence: 99%