2020
DOI: 10.1016/j.cma.2020.113299
Geometric deep learning for computational mechanics Part I: anisotropic hyperelasticity

Cited by 172 publications (89 citation statements) | References 99 publications
“…Following this line, Ref. [49] introduced a graph convolutional deep neural network, incorporating the non-Euclidean weighted graph data to predict the elastic response of materials with complex microstructures. For recent works on CNNs, we refer to [50][51][52], and the citations therein.…”
Section: Deep Learning (DL) Architectures (mentioning)
confidence: 99%
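For concreteness, the sketch below shows one way a convolution over a weighted (non-Euclidean) microstructure graph can be used to predict an elastic response, in the spirit of the approach the statement above attributes to Ref. [49]. It is written in PyTorch with a dense adjacency matrix; the two-layer structure, mean-pooling readout, and all names are illustrative assumptions, not the cited architecture.

```python
# Minimal sketch: graph convolution on a weighted microstructure graph,
# followed by a graph-level readout predicting effective elastic moduli.
# All architectural choices below are illustrative assumptions.
import torch
import torch.nn as nn

class WeightedGraphConv(nn.Module):
    """One graph convolution: H' = relu(D^{-1/2} (A + I) D^{-1/2} H W)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, node_features, weighted_adjacency):
        # Symmetric normalization of the weighted adjacency with self-loops.
        a = weighted_adjacency + torch.eye(weighted_adjacency.shape[0])
        d_inv_sqrt = torch.diag(a.sum(dim=1).pow(-0.5))
        a_norm = d_inv_sqrt @ a @ d_inv_sqrt
        return torch.relu(self.linear(a_norm @ node_features))

class MicrostructureElasticityNet(nn.Module):
    """Two graph convolutions + mean pooling -> predicted elastic properties."""
    def __init__(self, in_features, hidden, n_outputs):
        super().__init__()
        self.gc1 = WeightedGraphConv(in_features, hidden)
        self.gc2 = WeightedGraphConv(hidden, hidden)
        self.readout = nn.Linear(hidden, n_outputs)

    def forward(self, node_features, weighted_adjacency):
        h = self.gc1(node_features, weighted_adjacency)
        h = self.gc2(h, weighted_adjacency)
        return self.readout(h.mean(dim=0))  # one prediction per graph
```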
“…(3) by setting γ₃ = 0. An H¹ norm loss function for a hyperelastic energy functional can be seen in Vlassis et al. [18].…”
Section: Sobolev Training of a Hyperelastic Energy Functional (mentioning)
confidence: 99%
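The H¹-norm (Sobolev) loss referenced above can be illustrated with a short sketch: the objective penalizes both the misfit of the predicted energy functional and the misfit of its strain derivative (the work-conjugate stress), obtained here by automatic differentiation. The network architecture, variable names, and loss weights are assumptions for illustration, not the exact setup of Vlassis et al. [18].

```python
# Minimal sketch of Sobolev (H^1-type) training of a hyperelastic energy
# network: match the energy psi(E) and its gradient dpsi/dE (the stress).
# Architecture and names are illustrative assumptions.
import torch
import torch.nn as nn

energy_net = nn.Sequential(nn.Linear(6, 64), nn.Softplus(),
                           nn.Linear(64, 64), nn.Softplus(),
                           nn.Linear(64, 1))

def sobolev_loss(strain, energy_true, stress_true, w_energy=1.0, w_stress=1.0):
    """H^1-style loss: L2 misfit of the energy plus L2 misfit of its gradient."""
    strain = strain.clone().requires_grad_(True)   # strain in Voigt form, shape (N, 6)
    energy_pred = energy_net(strain)               # predicted energy, shape (N, 1)
    # Stress as the derivative of the learned energy w.r.t. strain, per sample.
    stress_pred = torch.autograd.grad(energy_pred.sum(), strain, create_graph=True)[0]
    return (w_energy * ((energy_pred - energy_true) ** 2).mean()
            + w_stress * ((stress_pred - stress_true) ** 2).mean())

# Usage with synthetic placeholder data:
# E = torch.randn(32, 6); psi = torch.randn(32, 1); S = torch.randn(32, 6)
# loss = sobolev_loss(E, psi, S); loss.backward()
```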
“…The major issue that limits the adoption of neural network constitutive models is the lack of interpretability and the vulnerability to overfitting. While existing regularization techniques such as dropout layers [16], cross-validation [17,18], and/or enlarging the database can help, it remains difficult to assess the credibility of these models without interpretability of the underlying laws deduced from the neural network. Another approach could involve symbolic regression through reinforcement learning [3] or genetic algorithms [19], which may lead to explicitly written evolution laws; however, the fitness of these equations often comes at the expense of readability.…”
Section: Introduction (mentioning)
confidence: 99%
“…Other works have represented designs as graphs. [25] used a graph-based convolutional model to learn fluid dynamics on meshed surfaces, [27] used a similar approach to learn the structural behavior of a thin shell, and [28] learned material properties from graph-based microstructures. The closest existing work to this one is probably [29], in which graph representations of trusses were used to optimize cross-section sizes for structural loads.…”
Section: Surrogate Modeling Without Parametric Design FEA (mentioning)
confidence: 99%
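As an illustration of the graph representations mentioned in this statement, the sketch below encodes a truss as node and edge feature arrays suitable for a graph neural network surrogate. The feature choices and the function name are assumptions made here for illustration, not the encoding used in [25], [27], [28], or [29].

```python
# Minimal sketch: encode a truss as a graph (joints = nodes, members = edges)
# for a learned structural surrogate. Feature choices are illustrative assumptions.
import numpy as np

def truss_to_graph(joint_coords, members, cross_sections, loads, supports):
    """Return node features, edge index, and edge features for a GNN.

    joint_coords:   (n_joints, 3) array of joint positions
    members:        list of (i, j) joint index pairs, one per bar
    cross_sections: (n_members,) array of member cross-sectional areas
    loads:          (n_joints, 3) array of applied nodal forces
    supports:       (n_joints,) boolean array, True where the joint is fixed
    """
    node_features = np.hstack([joint_coords, loads, supports[:, None].astype(float)])
    # Undirected members are stored as two directed edges for message passing.
    edge_index = np.array([[i, j] for i, j in members] + [[j, i] for i, j in members]).T
    lengths = np.linalg.norm(joint_coords[edge_index[1]] - joint_coords[edge_index[0]], axis=1)
    areas = np.concatenate([cross_sections, cross_sections])
    edge_features = np.stack([lengths, areas], axis=1)
    return node_features, edge_index, edge_features
```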