Proceedings of the 14th ACM SIGPLAN International Conference on Functional Programming 2009
DOI: 10.1145/1596550.1596579
Beautiful differentiation

Abstract: Automatic differentiation (AD) is a precise, efficient, and convenient method for computing derivatives of functions. Its forward-mode implementation can be quite simple even when extended to compute all of the higher-order derivatives as well. The higher-dimensional case has also been tackled, though with extra complexity. This paper develops an implementation of higher-dimensional, higher-order, forward-mode AD in the extremely general and elegant setting of calculus on manifolds and derives that implementation from a simple and precise specification.
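To make the abstract concrete: the essence of first-order forward-mode AD in Haskell can be sketched with dual numbers, each pairing a value with its derivative. This is a minimal illustrative sketch in the spirit of the paper, not the paper's actual (more general, higher-order, higher-dimensional) code:

```haskell
-- Dual numbers: a value paired with the derivative at that value.
data D a = D a a deriving Show

instance Num a => Num (D a) where
  D x x' + D y y' = D (x + y) (x' + y')
  D x x' * D y y' = D (x * y) (x' * y + x * y')   -- product rule
  negate (D x x') = D (negate x) (negate x')
  fromInteger n   = D (fromInteger n) 0           -- constants have zero derivative
  abs    (D x x') = D (abs x) (x' * signum x)
  signum (D x _)  = D (signum x) 0

instance Fractional a => Fractional (D a) where
  recip (D x x') = D (recip x) (negate x' / (x * x))
  fromRational r = D (fromRational r) 0

instance Floating a => Floating (D a) where
  pi             = D pi 0
  exp  (D x x')  = D (exp x)  (x' * exp x)        -- chain rule throughout
  log  (D x x')  = D (log x)  (x' / x)
  sin  (D x x')  = D (sin x)  (x' * cos x)
  cos  (D x x')  = D (cos x)  (x' * negate (sin x))
  sinh (D x x')  = D (sinh x) (x' * cosh x)
  cosh (D x x')  = D (cosh x) (x' * sinh x)
  asin (D x x')  = D (asin x) (x' / sqrt (1 - x*x))
  acos (D x x')  = D (acos x) (negate x' / sqrt (1 - x*x))
  atan (D x x')  = D (atan x) (x' / (1 + x*x))
  asinh (D x x') = D (asinh x) (x' / sqrt (x*x + 1))
  acosh (D x x') = D (acosh x) (x' / sqrt (x*x - 1))
  atanh (D x x') = D (atanh x) (x' / (1 - x*x))

-- Differentiate a function at a point by seeding the perturbation with 1.
deriv :: Num a => (D a -> D a) -> a -> a
deriv f x = let D _ x' = f (D x 1) in x'
```

For example, `deriv (\x -> x * sin x) pi` evaluates to `pi * cos pi + sin pi = -pi`, with no symbolic manipulation and no numerical approximation.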

Cited by 36 publications (7 citation statements) · References 12 publications · Citing publications span 2010–2024

Citation statements (ordered by relevance):
“…Recent AD systems, such as MYIA, TANGENT, and those in footnote 1, as well as the HASKELL AD package available on Cabal (Kmett, 2010), the "Beautiful Differentiation" system (Elliott, 2009), and the "Compiling to Categories" system (Elliott, 2017), have been implemented for higher-order languages like SCHEME, ML, HASKELL, F#, PYTHON, LUA, and JULIA. One by one, many of these systems have come to discover the kind of perturbation confusion reported by Siskind & Pearlmutter (2005) and have come to implement the corresponding tagging mechanisms.…”
Section: Results
confidence: 99%
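The perturbation confusion mentioned in this excerpt arises when nested applications of the derivative operator share an untagged perturbation. A minimal sketch of the phenomenon (a deliberately naive, monomorphic variant of the dual numbers above; not the code of any of the cited systems):

```haskell
-- Naive dual numbers: a single, untagged perturbation slot shared by
-- every nested call to 'd'. This is exactly what goes wrong.
data Dual = Dual Double Double deriving Show

instance Num Dual where
  Dual x x' + Dual y y' = Dual (x + y) (x' + y')
  Dual x x' * Dual y y' = Dual (x * y) (x' * y + x * y')
  negate (Dual x x')    = Dual (negate x) (negate x')
  fromInteger n         = Dual (fromInteger n) 0
  abs (Dual x x')       = Dual (abs x) (x' * signum x)
  signum (Dual x _)     = Dual (signum x) 0

-- Naive derivative operator, returning a Dual so it can be nested.
d :: (Dual -> Dual) -> Dual -> Dual
d f (Dual x _) = Dual x' 0 where Dual _ x' = f (Dual x 1)

-- The inner derivative is 1, so the outer function is x ↦ x * 1 and the
-- correct answer is 1. This naive version reports Dual 2.0 0.0: the inner
-- d's perturbation on y is conflated with the one already carried by x.
confused :: Dual
confused = d (\x -> x * d (\y -> x + y) 1) 1
```

The standard repairs are the tagging mechanisms the excerpt refers to (distinguishing each invocation's perturbation at runtime) or distinguishing nesting levels in the types, as the polymorphic `D (D a)` representation does when constants are lifted correctly.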
“…Symbolic differentiation [36] is used for many purposes. It is used to symbolically expand known derivatives:…”
Section: Differentiation
confidence: 99%
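Symbolic differentiation, as the excerpt describes, rewrites an expression tree into the tree of its derivative. A minimal sketch on a tiny expression type (the constructor and function names here are illustrative, not from any cited system):

```haskell
-- A tiny expression language in one variable.
data Expr = Var            -- the independent variable
          | Const Double
          | Add Expr Expr
          | Mul Expr Expr
          | Sin Expr
          | Cos Expr
  deriving Show

-- Symbolically expand the derivative of an expression.
diff :: Expr -> Expr
diff Var       = Const 1
diff (Const _) = Const 0
diff (Add u v) = Add (diff u) (diff v)                 -- sum rule
diff (Mul u v) = Add (Mul (diff u) v) (Mul u (diff v)) -- product rule
diff (Sin u)   = Mul (Cos u) (diff u)                  -- chain rule
diff (Cos u)   = Mul (Const (-1)) (Mul (Sin u) (diff u))

-- e.g. diff (Mul Var (Sin Var))
--   ==> Add (Mul (Const 1.0) (Sin Var)) (Mul Var (Mul (Cos Var) (Const 1.0)))
```

Without simplification, repeated differentiation swells the expression, which is one motivation for combining symbolic rules with AD-style evaluation, as the next excerpts note.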
“…Symbolic Jacobian matrices consisting of derivatives have many applications, for example to speed up simulation runtime [19]. Such a matrix is often computed using automatic differentiation [36] that combines symbolic differentiation with other techniques to achieve fast computation. If there is no symbolic Jacobian available, a numerical Jacobian might instead be estimated by the numerical solvers.…”
Section: Differentiation
confidence: 99%
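Forward-mode AD computes such a Jacobian one column per input variable, by seeding a unit perturbation on one input at a time. A minimal sketch under that scheme (reusing the dual-number representation from the earlier sketch; `jacobian` is an assumed name, not an API of the cited tools):

```haskell
import Data.List (transpose)

-- Dual numbers, as in the earlier sketch.
data D = D Double Double

instance Num D where
  D x x' + D y y' = D (x + y) (x' + y')
  D x x' * D y y' = D (x * y) (x' * y + x * y')
  negate (D x x') = D (negate x) (negate x')
  fromInteger n   = D (fromInteger n) 0
  abs (D x x')    = D (abs x) (x' * signum x)
  signum (D x _)  = D (signum x) 0

-- Jacobian of f : R^n -> R^m via one forward pass per input variable.
-- jacobian f x !! i !! j  ==  d f_i / d x_j
jacobian :: ([D] -> [D]) -> [Double] -> [[Double]]
jacobian f xs = transpose [ [ x' | D _ x' <- f (seed j) ] | j <- [0 .. n - 1] ]
  where
    n      = length xs
    seed j = [ D x (if i == j then 1 else 0) | (i, x) <- zip [0 ..] xs ]

-- e.g. jacobian (\[x, y] -> [x * y, x + y]) [2, 3]  ==  [[3,2],[1,1]]
```

The cost is one evaluation of `f` per input, which is why forward mode suits the wide-output, narrow-input Jacobians typical of simulation right-hand sides.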
“…Symbolic differentiation (Elliott, 2009) is used for many purposes. It is used to symbolically expand known derivatives (19) or as an operation during index reduction.…”
Section: Differentiation
confidence: 99%
“…If there is no symbolic Jacobian available, a numerical one might instead be estimated by the numerical solvers. Such a matrix is often computed using automatic differentiation (Elliott, 2009) which combines symbolic and/or automatic differentiation with other techniques to achieve fast computation.…”
Section: Differentiation
confidence: 99%