2023
DOI: 10.1007/s00466-022-02260-0

FE$^\textrm{ANN}$: an efficient data-driven multiscale approach based on physics-constrained neural networks and automated data mining

Abstract: Herein, we present a new data-driven multiscale framework called FE$^\textrm{ANN}$ which is based on two main keystones: the usage of physics-constrained artificial neural networks (ANNs) as macroscopic surrogate models and an autonomous data mining process. Our approach allows the efficient simulation of materials with complex underlying microstructures which reveal an overall anisotropic and nonlinear behavior on th…

Cited by 49 publications (30 citation statements)
References 89 publications
“…This clarifies that such a network may be arbitrarily nested, but for fixed weights $w^l_{nm}$ is a well-defined function of the input $I$. In this sense, the network as a function (10) maps the input $I$ to the output $O$. In order to adapt the weights $w^l_{nm}$ such that the predictions of the NN match the expected values, a training data set $\mathcal{D} := \{\mathcal{D}_1, \ldots, \mathcal{D}_{N_\mathrm{ds}}\}$ consisting of $N_\mathrm{ds} \in \mathbb{N}$ data tuples $\mathcal{D}_\alpha$ is required.…”
Section: Feedforward Neural Network
confidence: 89%
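The quoted passage describes a feedforward network as an arbitrarily nested function that, for fixed weights $w^l_{nm}$, is a well-defined map from input $I$ to output $O$. A minimal NumPy sketch of this view (an illustrative assumption, not the paper's implementation; the layer sizes and softplus activation are chosen for the example):

```python
# Sketch: a feedforward network as a nested function of its input.
# For fixed weight matrices, the network is a well-defined map I -> O.
import numpy as np

def softplus(x):
    # Smooth activation, used here for illustration.
    return np.log1p(np.exp(x))

def feedforward(I, weights):
    """Evaluate the nested map O = W_L · f(... f(W_1 · I) ...) for fixed weights."""
    a = I
    for W in weights[:-1]:
        a = softplus(W @ a)   # hidden layers
    return weights[-1] @ a    # linear output layer

rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 3)),   # layer 1: R^3 -> R^4
           rng.normal(size=(2, 4))]   # output:  R^4 -> R^2
O = feedforward(np.ones(3), weights)
print(O.shape)  # (2,)
```

Training, as the passage notes, then amounts to adapting the weights so that the predictions match the $N_\mathrm{ds}$ data tuples of the training set.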
“…Consequently, the activations must be twice differentiable to enable an optimization. 10,11,48 Therefore, the hyperbolic tangent and the softplus activation are used for the corresponding networks in the following.…”
Section: Neural Network Enforcing Physics In a Weak Form
confidence: 99%
“…This was extended by Fuhg and Bouklas [37] to anisotropic materials, strictly enforcing known symmetries up to transverse isotropy; this work showed that this simplified learning approach led to significant generalization capabilities when the physics do not radically change outside of the training region. This approach was also utilized by Kalina et al [41], integrated into a multiscale framework with automated data mining. In Fuhg et al [32], tensor basis NNs were utilized to discover the character of the anisotropy of the material through labeled data.…”
Section: Introduction
confidence: 99%