2020
DOI: 10.48550/arxiv.2006.03364
Preprint

Structure preserving deep learning

Abstract: Over the past few years, deep learning has risen to the foreground as a topic of massive interest, mainly as a result of successes obtained in solving large-scale image processing tasks. There are multiple challenging mathematical problems involved in applying deep learning: most deep learning methods require the solution of hard optimisation problems, and a good understanding of the tradeoff between computational effort, amount of data and model complexity is required to successfully design a deep learning ap…

Cited by 6 publications (7 citation statements)
References 64 publications (122 reference statements)
“…Thus, the learned time integrator should fulfill the first principles of physics to provide credible predictions of future events to help in decision-making. By structure-preserving neural networks (SPNN) we refer to a class of techniques that are constructed to satisfy some a priori known properties of the problem, such as equivariance [71] or energy conservation [57,60]. In their most general form, they can be applied to conservative as well as dissipative problems, in which the principles of thermodynamics are satisfied by construction [39,38].…”
Section: Learning the Dynamical Evolution Based On Structure-Preservi…
Mentioning confidence: 99%
“…Note that we do not use the standard Tikhonov regularization (so-called weight decay) on the weights, as it does not guarantee smoothness in time, which is crucial for reversible networks and integration in time (Celledoni et al, 2020).…”
Section: Regularization
Mentioning confidence: 99%
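To make the distinction concrete: in the ODE view of residual networks, layer index plays the role of time, so "smoothness in time" penalizes how much the weights change from one layer to the next, whereas weight decay only penalizes their magnitude. The two helper functions below are our own sketch of this contrast, not code from the cited work.

```python
import numpy as np

def weight_decay(weights):
    """Standard Tikhonov regularization: penalizes weight magnitude,
    sum over layers of ||W_l||_F^2."""
    return sum(float(np.sum(W**2)) for W in weights)

def smoothness_in_time(weights):
    """Penalizes differences between consecutive layers' weights,
    sum over layers of ||W_{l+1} - W_l||_F^2, encouraging the weights
    to vary smoothly with depth (i.e. with "time" in the ODE view)."""
    return sum(float(np.sum((W2 - W1)**2))
               for W1, W2 in zip(weights, weights[1:]))

# Constant-in-depth weights: large decay penalty, zero smoothness penalty.
layers = [np.ones((4, 4)) for _ in range(6)]
decay_pen = weight_decay(layers)          # > 0
smooth_pen = smoothness_in_time(layers)   # == 0.0
```

Note that constant weights minimize the smoothness penalty no matter how large they are, which is exactly why it behaves differently from weight decay near a continuous-time limit.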
“…Thus, it is fitting that a geometric integrator is used to solve the optimal control problem, to guarantee that the numerical solution is representative of the control system. First, we describe how optimal control is used to formulate the training of deep neural networks in section 2.3, and here we adhere to the approach of [10,20,67]. The resulting Hamilton's equations are then solved numerically using a structure-preserving integrator.…”
Section: Hamilton-Jacobi Integrators For Deep Learning
Mentioning confidence: 99%
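In the optimal-control view of training, the forward pass propagates the state and the backward pass propagates the costate (adjoint), and the costate recursion is exactly backpropagation. The ResNet-style layer x_{k+1} = x_k + h·tanh(W x_k) below is our illustration of this structure, not the cited authors' exact scheme.

```python
import numpy as np

def forward(x0, Ws, h):
    """Forward state propagation x_{k+1} = x_k + h * tanh(W_k x_k)."""
    xs = [x0]
    for W in Ws:
        xs.append(xs[-1] + h * np.tanh(W @ xs[-1]))
    return xs

def backward(xs, Ws, h, dLdxT):
    """Costate (adjoint) recursion lam_k = lam_{k+1} + h * J_k^T lam_{k+1},
    where J_k is the Jacobian of tanh(W_k x_k) w.r.t. x_k. This is the
    discrete adjoint of the forward step, i.e. backpropagation."""
    lam = dLdxT
    for W, x in zip(reversed(Ws), reversed(xs[:-1])):
        J = np.diag(1.0 - np.tanh(W @ x)**2) @ W
        lam = lam + h * J.T @ lam
    return lam

rng = np.random.default_rng(1)
Ws = [0.1 * rng.standard_normal((3, 3)) for _ in range(4)]
x0 = rng.standard_normal(3)
h = 0.1

xs = forward(x0, Ws, h)
# Gradient of the loss L(x_T) = sum(x_T) with respect to the input x0:
lam0 = backward(xs, Ws, h, np.ones(3))
```

A structure-preserving treatment would discretize the combined state/costate (Hamiltonian) system with a geometric scheme; the sketch above only shows why Hamilton's equations arise at all in training.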