2022
DOI: 10.3389/fevo.2022.1010278
Automatic differentiation and the optimization of differential equation models in biology

Abstract: A computational revolution unleashed the power of artificial neural networks. At the heart of that revolution is automatic differentiation, which calculates the derivative of a performance measure relative to a large number of parameters. Differentiation enhances the discovery of improved performance in large models, an achievement that was previously difficult or impossible. Recently, a second computational advance optimizes the temporal trajectories traced by differential equations. Optimization requires dif…

Cited by 11 publications (11 citation statements) · References 22 publications
“…Efficient optimization of large models typically gains greatly from automatic differentiation (Baydin et al., 2018; Margossian, 2019). In essence, the computer code automatically computes the exact derivatives of the loss function with respect to each parameter, rapidly calculating the full gradient that allows the optimization process to move steadily in the direction that improves the fit (Frank, 2022a).…”
Section: Methods · Citation type: mentioning · Confidence: 99%
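A minimal sketch of the gradient calculation that statement describes, assuming JAX as the automatic-differentiation library. The quadratic model, synthetic data, and parameter names are hypothetical illustrations, not the code of the cited works.

import jax
import jax.numpy as jnp

def loss(params, x, y):
    # Hypothetical model y_hat = a*x^2 + b*x + c with a sum-of-squares loss.
    a, b, c = params
    y_hat = a * x**2 + b * x + c
    return jnp.sum((y_hat - y) ** 2)

# jax.grad builds a function that returns the exact derivative of the loss
# with respect to every entry of params in a single reverse-mode pass.
grad_loss = jax.grad(loss)

x = jnp.linspace(0.0, 1.0, 50)
y = 2.0 * x**2 - x + 0.5              # synthetic observations
params = jnp.array([1.0, 0.0, 0.0])

for _ in range(100):
    # Plain gradient descent: step in the direction that improves the fit.
    params = params - 0.01 * grad_loss(params, x, y)

Because the gradient is exact rather than a finite-difference estimate, each step moves reliably downhill even when the parameter vector is large.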
“…Challenges with Bonnaffé et al.'s method include long computation times, limited flexibility for studying alternative or larger models within the same computational framework, and lack of connection to rapidly developing technical advances in automatic differentiation (Frank, 2022a). A recent update by Bonnaffé and Coulson (2022) provides an alternative solution for some of these challenges.…”
Section: Introduction · Citation type: mentioning · Confidence: 99%
“…This is to provide an example of a longer time series, and to offer a point of comparison with previous and future implementations of NODEs, which commonly use this time series (e.g., Bonnaffé, Sheldon, et al., 2021; Frank, 2022).…”
Section: Case Studies · Citation type: mentioning · Confidence: 99%
“…[9] Computationally, they scale very well to higher dimensions, can easily use automatic differentiation for optimization, and can be embedded within stochastic differential equations while maintaining the great computational advantages of automatic differentiation [10–12]. Without these technical advantages, computational optimization of TF networks is difficult and has not previously been studied in a widely applicable way. The computer code provided with this article can easily be adapted to study other biological challenges.…”
Section: Introduction · Citation type: mentioning · Confidence: 99%
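The computational advantage this excerpt emphasizes comes from differentiating through an entire solver trajectory. A minimal sketch, assuming JAX and its differentiable odeint; the logistic-growth model and the synthetic observed trajectory are hypothetical illustrations, not the cited works' code.

import jax
import jax.numpy as jnp
from jax.experimental.ode import odeint

def dndt(n, t, r, k):
    # Logistic growth: dn/dt = r * n * (1 - n/k).
    return r * n * (1.0 - n / k)

t = jnp.linspace(0.0, 10.0, 50)
n0 = jnp.array(0.1)
observed = odeint(dndt, n0, t, 1.0, 2.0)   # trajectory with r = 1.0, k = 2.0

def loss(params):
    # Sum-of-squares mismatch between fitted and observed trajectories.
    r, k = params
    predicted = odeint(dndt, n0, t, r, k)
    return jnp.sum((predicted - observed) ** 2)

# Automatic differentiation propagates gradients through the whole ODE
# solve, so the equation's parameters can be fit by gradient descent.
grad_loss = jax.grad(loss)
params = jnp.array([0.5, 1.0])
for _ in range(200):
    params = params - 0.01 * grad_loss(params)

The same mechanics scale to many state variables and parameters, which is why optimization of models like TF networks becomes tractable with these tools.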