2022
DOI: 10.48550/arxiv.2204.08621
Preprint
Proximal Implicit ODE Solvers for Accelerating Learning Neural ODEs

Abstract: Learning neural ODEs often requires solving very stiff ODE systems, primarily using explicit adaptive step size ODE solvers. These solvers are computationally expensive, requiring the use of tiny step sizes for numerical stability and accuracy guarantees. This paper considers learning neural ODEs using implicit ODE solvers of different orders leveraging proximal operators. The proximal implicit solver consists of inner-outer iterations: the inner iterations approximate each implicit update step using a fast op…
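The inner-outer structure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: it uses plain backward Euler as the implicit scheme and a simple damped fixed-point iteration as the inner solver, on a hypothetical stiff test problem; the step size, learning rate, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Hypothetical stiff test problem (not from the paper): y' = -50*(y - cos t).
def f(t, y):
    return -50.0 * (y - np.cos(t))

def proximal_implicit_euler_step(f, t, y, h, inner_iters=30, lr=0.2):
    """One backward-Euler update y_next = y + h*f(t+h, y_next),
    approximated by inner iterations on the residual
    r(z) = z - y - h*f(t+h, z). The damping factor lr and the
    iteration count are illustrative choices, not the paper's."""
    z = np.copy(y)                       # warm start at the current state
    for _ in range(inner_iters):
        r = z - y - h * f(t + h, z)      # residual of the implicit equation
        z = z - lr * r                   # inner correction toward the root
    return z

# Outer loop: integrate from t=0 to t=1 with a fixed step h=0.05.
t, y, h = 0.0, np.array([0.0]), 0.05
for _ in range(20):
    y = proximal_implicit_euler_step(f, t, y, h)
    t += h
# y[0] now tracks the quasi-steady solution near cos(1)
```

Because the implicit equation is solved iteratively rather than with a Newton-type linear solve, each outer step stays cheap even when the system is stiff; the paper extends this idea to implicit solvers of different orders via proximal operators.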

Cited by 1 publication (1 citation statement) · References 28 publications
“…However, to address the numerical errors inherent in this approach, several techniques have been proposed. These include the checkpoint method [20,66], the asynchronous leapfrog method [67], the symplectic adjoint method [46], interpolation techniques [15], and the utilization of proximal implicit solvers [4].…”
Section: Data-driven Discovery of Dynamical Systems
confidence: 99%