2020
DOI: 10.48550/arxiv.2003.04617
Preprint
Differentiate Everything with a Reversible Embedded Domain-Specific Language

Abstract: This paper considers source-to-source automatic differentiation (AD) in a reversible language. We start by reviewing the limitations of traditional AD frameworks. To solve the issues in these frameworks, we developed NiLang, a reversible eDSL in Julia that can differentiate a general program while remaining compatible with Julia's ecosystem. It gives users the flexibility to trade off time, space, and energy, so that one can use it to obtain gradients and Hessians ranging from elementary mathematical functions,…

Cited by 3 publications (6 citation statements) | References 62 publications
“…Automatic differentiation is an algorithm implemented through computer programs to compute derivatives of specified functions. Its application in scientific computing has become increasingly widespread [29][30][31][32][33]. There are two forms of automatic differentiation: forward mode and reverse mode.…”
Section: Automatic Differentiation
confidence: 99%
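To make the two modes mentioned above concrete, here is a minimal sketch (a toy example, not code from the paper or from NiLang): forward mode propagates derivatives alongside values, while reverse mode runs the function forward and then replays the chain rule backwards.

```python
# Toy illustration of the two AD modes on f(x, y) = x*y + sin(x).
import math

# Forward mode: propagate (value, derivative) pairs ("dual numbers").
def f_forward(x, y, dx, dy):
    v1, d1 = x * y, dx * y + x * dy          # product rule
    v2, d2 = math.sin(x), math.cos(x) * dx   # chain rule through sin
    return v1 + v2, d1 + d2

# Reverse mode: forward pass, then accumulate adjoints backwards.
def f_reverse(x, y):
    z = x * y + math.sin(x)                  # forward pass
    dz = 1.0                                 # seed the output adjoint
    dx = dz * y + dz * math.cos(x)           # backward pass
    dy = dz * x
    return z, dx, dy

x, y = 2.0, 3.0
# Forward mode needs one sweep per input direction (seed (1,0), then (0,1)):
_, df_dx = f_forward(x, y, 1.0, 0.0)
_, df_dy = f_forward(x, y, 0.0, 1.0)
# Reverse mode recovers the whole gradient in a single backward sweep:
_, gx, gy = f_reverse(x, y)
```

Both routes agree with the analytic gradient (df/dx = y + cos(x), df/dy = x); they differ in how the cost scales with the number of inputs versus outputs.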
“…The contraction of the tropical tensor network on a GPU shows a nice acceleration (>20X) compared to a CPU due to the high arithmetic intensity of the computation. We further compared the performance of finding the ground-state configuration using forward-mode (ForwardDiff.jl [50]) and reverse-mode (Nilang.jl [51]) automatic differentiation, respectively. Reverse-mode automatic differentiation is more efficient in this application than forward-mode AD, which has computational complexity proportional to the number of parameters L².…”
Section: ⊗L
confidence: 99%
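The complexity gap noted in that quote can be sketched in a toy example (assumed for illustration; this is not the paper's tensor-network code): for a scalar function of N parameters, forward mode needs one tangent sweep per parameter, while reverse mode recovers the whole gradient in one backward sweep.

```python
# Toy scaling comparison for a scalar function of N parameters.
import math

def f(p):
    # scalar "loss" of N parameters
    return sum(math.sin(v) for v in p)

def grad_forward(p):
    # forward mode: one sweep per parameter, seeding tangent e_i each time
    grads = []
    for i in range(len(p)):
        tangent = sum(math.cos(v) * (1.0 if j == i else 0.0)
                      for j, v in enumerate(p))
        grads.append(tangent)
    return grads                     # N sweeps total: cost grows with N

def grad_reverse(p):
    # reverse mode: one backward sweep yields every adjoint d f / d p_i
    return [math.cos(v) for v in p]  # single sweep, independent of N
```

Both return the same gradient; the point is that `grad_forward` repeats the whole computation once per parameter, which is why reverse mode wins when the output is a scalar and the parameter count is large.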
“…Instead of deriving the backward rule manually, we differentiate the source code by writing it in a reversible programming manner [51]. Due to the overhead of reversible programming, the memory usage of our reversible implementation is 2L times that of the original program, while the computational time is also several times slower.…”
Section: Reversible Programming Approach To Compute Gradients
confidence: 99%
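The memory overhead described in that quote comes from keeping enough state to run the program backwards. A minimal sketch (a plain tape-based stand-in, not NiLang's actual reversible machinery): differentiating through an L-step recurrence by storing every intermediate state, so memory grows linearly with L.

```python
# Reverse-mode gradient through the L-step recurrence x_{k+1} = x_k + sin(x_k),
# storing the whole trajectory (the "tape") so the backward pass can replay it.
import math

def grad_through_recurrence(x0, L):
    # forward pass: record every intermediate state -- O(L) memory
    xs = [x0]
    for _ in range(L):
        xs.append(xs[-1] + math.sin(xs[-1]))
    # backward pass: chain rule applied step by step, newest state first
    adj = 1.0                          # d(output)/d(x_L)
    for xk in reversed(xs[:-1]):
        adj *= 1.0 + math.cos(xk)      # local derivative d(x_{k+1})/d(x_k)
    return xs[-1], adj                 # final value and d(x_L)/d(x_0)
```

A truly reversible program can instead *uncompute* intermediates rather than store them all, trading extra arithmetic for memory; the constant-factor overheads quoted above (2L memory, several times slower) reflect that trade-off.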