2022
DOI: 10.1609/aaai.v36i8.20799

HoD-Net: High-Order Differentiable Deep Neural Networks and Applications

Abstract: We introduce a deep architecture named HoD-Net to enable high-order differentiability for deep learning. HoD-Net is based on and generalizes the complex-step finite difference (CSFD) method. While similar to classic finite difference, CSFD approaches the derivative of a function from a higher-dimensional complex domain, leading to highly accurate and robust differentiation computation without numerical stability issues. This method can be coupled with backpropagation and adjoint perturbation methods for an effic…
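The abstract's core mechanism is the complex-step rule: for a real-analytic function f, evaluating f at x + ih and taking the imaginary part gives f'(x) ≈ Im f(x + ih) / h with no subtractive cancellation, so h can be taken extremely small. The sketch below is a minimal NumPy illustration of that first-order rule only (function names are mine; it does not reproduce HoD-Net's coupling with backpropagation or its high-order generalization):

```python
import numpy as np

def csfd_derivative(f, x, h=1e-30):
    """First derivative of a real-analytic f at x via the complex-step method.

    No difference of nearly equal numbers is formed, so h can be made tiny
    without the cancellation error that plagues classic finite differences.
    """
    return np.imag(f(x + 1j * h)) / h

# Toy check against the analytic derivative of f(x) = exp(x) * sin(x).
f = lambda x: np.exp(x) * np.sin(x)
df_exact = lambda x: np.exp(x) * (np.sin(x) + np.cos(x))

x0 = 0.7
print(csfd_derivative(f, x0))  # agrees with df_exact(x0) to machine precision
print(df_exact(x0))
```

In exact arithmetic the truncation error is O(h^2), and because no subtraction of nearly equal values occurs, a step like h = 1e-30 remains safe in double precision.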

Cited by 4 publications (2 citation statements)
References 33 publications
“…Improving scalability and enhancing optimization efficiency is of interest. This could potentially be achieved by applying high-order differentiable network architectures [Shen et al 2022] and optimizing with more effective gradient descent involving multiple learning objectives [Dong et al 2022]. Applying 3D Gaussian splatting techniques [Kerbl et al 2023] and recent advancements in reconstruction under sparse view settings holds promise for improving density reconstruction.…”
Section: Limitations (mentioning)
confidence: 99%
“…The lROB method is easily extensible to ECSW [17], although special attention must be paid to problems including path-dependent materials. Autoencoder (AE) is a global nonlinear approach and has recently been used by [18,19]. In the AE ROM, the reduced basis is replaced by the Jacobian matrix of the AE.…”
Section: Introduction (mentioning)
confidence: 99%
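As loose context for the last excerpt, "the reduced basis is replaced by the Jacobian matrix of the AE" can be read as: the decoder maps a latent vector z to a full-order state u, and the Jacobian dU/dz serves as a state-dependent basis in place of a fixed linear reduced-order basis. The toy sketch below (a hypothetical two-layer decoder with made-up names, not the method of [18,19]) builds that Jacobian column by column with the same complex-step rule shown above:

```python
import numpy as np

def decoder(z, W1, W2):
    """Toy two-layer decoder mapping a latent vector z to a full-order state u."""
    return W2 @ np.tanh(W1 @ z)

def decoder_jacobian_csfd(z, W1, W2, h=1e-30):
    """Jacobian dU/dz of the toy decoder, one complex-step probe per latent dim."""
    cols = []
    for k in range(z.size):
        zc = z.astype(complex)   # fresh complex copy of the latent vector
        zc[k] += 1j * h          # perturb one latent coordinate along the imaginary axis
        cols.append(np.imag(decoder(zc, W1, W2)) / h)
    return np.stack(cols, axis=1)  # shape: (n_full, n_latent)

rng = np.random.default_rng(0)
n_full, n_hidden, n_latent = 12, 8, 3
W1 = rng.standard_normal((n_hidden, n_latent))
W2 = rng.standard_normal((n_full, n_hidden))
z = rng.standard_normal(n_latent)
print(decoder_jacobian_csfd(z, W1, W2).shape)  # (12, 3)
```

Each column is the sensitivity of the full-order state to one latent coordinate, which is the role a column of a linear reduced basis would otherwise play.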