2021
DOI: 10.1145/3450626.3459754
High-order differentiable autoencoder for nonlinear model reduction

Abstract: This paper provides a new avenue for exploiting deep neural networks to improve physics-based simulation. Specifically, we integrate the classic Lagrangian mechanics with a deep autoencoder to accelerate elastic simulation of deformable solids. Due to the inertia effect, the dynamic equilibrium cannot be established without evaluating the second-order derivatives of the deep autoencoder network. This is beyond the capability of off-the-shelf automatic differentiation packages and algorithms, which mainly focus…
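The abstract's key computational point is that subspace dynamics through a decoder network requires not just first- but second-order derivatives of the network with respect to the reduced coordinates. Below is a minimal sketch, not the authors' implementation, of what those quantities are for a toy decoder using off-the-shelf nested automatic differentiation in JAX; the network architecture, dimensions, and initialization are illustrative assumptions.

# A minimal sketch (not the paper's implementation): for a decoder x = D(z)
# mapping reduced coordinates z to full-space positions x, the subspace
# equations of motion need both the Jacobian dD/dz and the second-order
# derivative d^2D/dz^2. Network shape and initialization are illustrative
# assumptions.
import jax
import jax.numpy as jnp

def decoder(z, params):
    # Toy two-layer MLP decoder: reduced coords z (r,) -> full positions x (n,).
    h = jnp.tanh(params["W1"] @ z + params["b1"])
    return params["W2"] @ h + params["b2"]

r, n = 8, 300                                    # illustrative reduced / full dimensions
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
params = {
    "W1": 0.1 * jax.random.normal(k1, (64, r)),
    "b1": jnp.zeros(64),
    "W2": 0.1 * jax.random.normal(k2, (n, 64)),
    "b2": jnp.zeros(n),
}
z = jax.random.normal(k3, (r,))

J = jax.jacfwd(decoder)(z, params)               # dD/dz, shape (n, r)
H = jax.jacfwd(jax.jacrev(decoder))(z, params)   # d^2D/dz^2, shape (n, r, r)
print(J.shape, H.shape)

Nesting generic forward- and reverse-mode passes in this way recomputes the network repeatedly and scales poorly with the full dimension n; per the abstract, a dedicated scheme for these high-order derivatives is exactly what the paper contributes.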

Cited by 26 publications (4 citation statements) | References 80 publications

Citation statements (ordered by relevance):
“…[FMD*19] use an autoencoder to construct nonlinear subspaces automatically. Some algorithms have been used to separate the subspace apart and learn only the nonlinear corrections, such as those applied in [RCPO21] and [SYS*21]. Beyond model reduction, NNWarp in [LSW*18] makes use of simple networks to correct linear nodal deformation to nonlinear ones.…”
Section: Related Work
confidence: 99%
“…These methods unfortunately scale in complexity with the number of rotations to be tracked, of which there may be many in a large-scale heterogeneous material. While non-linear subspaces via Deep Neural Networks have also been proposed [Shen et al 2021], the resulting complexity of the subspace requires many optimization steps in order to reach a solution [Sharp et al 2023].…”
Section: Related Work, 2.1 Subspaces for Heterogeneous Materials
confidence: 99%
“…Other works have also used ideas similar to kinematic filters to transfer data between representations, e.g., between grids and particles [JSS*15], or between grids and rigid modes [FLLP13]. Some recent works constrain motion to the null‐space of predefined reduced spaces; some use kinematic filters to implement the constraint efficiently [SYS*21, RCPO22], while others could replace Lagrange multiplier formulations with kinematic filters for higher performance [ZBLJ20].…”
Section: Related Work
confidence: 99%