2021 IEEE High Performance Extreme Computing Conference (HPEC)
DOI: 10.1109/hpec49654.2021.9622796

A Comparison of Automatic Differentiation and Continuous Sensitivity Analysis for Derivatives of Differential Equation Solutions

Cited by 40 publications (31 citation statements). References 19 publications.
“…The computational complexity of obtaining the derivatives is O(np), where n is the number of time steps and p is the number of parameters [24]. While forward AD scales the same way as finite differences (FD), forward AD is faster than FD for small ODE systems (<100 parameters), as illustrated in the referenced article [24], because forward AD computes the derivatives simultaneously with the function evaluations. In reverse AD, the derivative of θ_{n+1} with respect to the LHS is back-propagated to the variables on the RHS.…”
Section: Methods
confidence: 99%
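To make the forward-mode point concrete, here is a minimal sketch, assuming a simple scalar decay ODE integrated with an explicit Euler loop (both illustrative, not from the paper), of forward-mode AD with ForwardDiff.jl: the dual number carries dθ/dp forward alongside θ itself, so each of the n steps pays O(p) extra work, matching the O(np) total cost quoted above.

```julia
using ForwardDiff

# Illustrative explicit Euler integration of theta' = -p * theta.
# When p is a ForwardDiff.Dual, theta becomes a Dual too, and the
# derivative d(theta)/dp is carried through every step alongside theta.
function euler_solve(p; theta0 = 1.0, dt = 0.01, nsteps = 100)
    theta = theta0
    for _ in 1:nsteps
        theta += dt * (-p * theta)   # RHS f(theta, p) = -p * theta
    end
    return theta                     # theta_{n+1} after nsteps
end

p = 2.0
dtheta_dp = ForwardDiff.derivative(euler_solve, p)  # sensitivity of the solution w.r.t. p
```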
“…Another technique one can use to calculate the derivatives of model solutions is to differentiate the numerical algorithm that computes the solution. This can be done with computational tools collectively known as automatic differentiation [19]. Forward-mode automatic differentiation is performed by carrying forward Jacobian-vector products at each successive calculation.…”
Section: Automatic Differentiation
confidence: 99%
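As a concrete illustration of that last sentence, the sketch below (the map f, the point x, and the direction v are hypothetical placeholders) computes a Jacobian-vector product J(x)·v in a single forward pass by seeding ForwardDiff.jl dual numbers with the perturbation direction:

```julia
using ForwardDiff

# Hypothetical smooth map R^2 -> R^2; any differentiable function would do.
f(x) = [x[1]^2 + sin(x[2]), x[1] * x[2]]

x = [1.0, 0.5]   # evaluation point
v = [1.0, 0.0]   # perturbation direction

# Seed each input with the corresponding component of v as its single
# partial; evaluating f once then carries J(x) * v forward.
duals = ForwardDiff.Dual.(x, v)
Jv = ForwardDiff.partials.(f(duals), 1)
```

Reverse mode propagates vector-Jacobian products vᵀJ in the opposite direction, which is why its cost scales with the number of outputs rather than the number of inputs.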
“…In the following sections we demonstrate the approach on two problems. This work is coded in Julia [18], using the following packages: DifferentialEquations.jl [19], [20], DiffEqFlux [2], [17], Flux.jl [21], ForwardDiff.jl [22], NLopt.jl [23] and Hyperopt.jl [24].…”
Section: Multiple Shooting with Neural DEs
confidence: 99%
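For readers unfamiliar with this stack, the sketch below shows the basic pattern these packages enable, assuming an illustrative scalar decay ODE that is not the paper's test problem: solve with DifferentialEquations.jl, then differentiate the solution with respect to its parameters by running ForwardDiff.jl through the solver.

```julia
using DifferentialEquations, ForwardDiff

# Illustrative scalar ODE u' = -p[1] * u on t in [0, 1].
function final_value(p)
    u0 = eltype(p)(1.0)                  # promote u0 so the state can hold duals
    f(u, p, t) = -p[1] * u
    prob = ODEProblem(f, u0, (0.0, 1.0), p)
    sol = solve(prob, Tsit5())
    return sol.u[end]                    # solution value at t = 1
end

grad = ForwardDiff.gradient(final_value, [2.0])  # d u(1) / d p
```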