2023
DOI: 10.1016/j.jcp.2023.112180
Accelerating discrete dislocation dynamics simulations with graph neural networks

Cited by 10 publications (5 citation statements)
References 46 publications
“…We used the processor from MeshGraphNets 41 for implementing graph convolutions. Although we have considered rotationally equivariant GNNs for predicting the noise ε (a vector quantity), the simple design of MeshGraphNets renders the computation to be fast without much loss to accuracy, as demonstrated in previous surrogate models for dislocation 42,43 and microstructure evolution. 44 The computational speed is an important aspect when performing long rollouts that would take hundreds of thousands of function evaluations.…”
Section: And An MLP M
mentioning confidence: 99%
“…We largely adopt the MeshGraphNet of references [34,35], a type of message-passing graph neural network. We used 48 hidden features in 2D and 64 in 3D.…”
Section: Message-Passing Graph Neural Network
mentioning confidence: 99%
“…Following [34,35], our GNN starts with vertex and edge encoders ENC V , ENC E transforming concatenated input features into a latent space:…”
Section: Message-Passing Graph Neural Network
mentioning confidence: 99%
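The encode-then-message-pass pattern the citing papers describe can be sketched in a few lines. This is a hedged toy illustration, not the authors' implementation: the MLPs are single-layer NumPy stand-ins (real MeshGraphNets stack deeper MLPs with LayerNorm), the feature dimensions and the toy graph are invented, and only the structure — vertex/edge encoders ENC_V, ENC_E followed by one edge-then-vertex message-passing update — follows the quoted description, with 48 latent features as mentioned for the 2D case.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(in_dim, out_dim):
    # Single-layer ReLU "MLP" stand-in for the encoders/processors.
    W = rng.standard_normal((in_dim, out_dim)) / np.sqrt(in_dim)
    b = np.zeros(out_dim)
    return lambda x: np.maximum(x @ W + b, 0.0)

latent = 48                        # 48 hidden features, per the 2D setting quoted
enc_v = mlp(3, latent)             # ENC_V: raw vertex features -> latent space
enc_e = mlp(4, latent)             # ENC_E: raw edge features  -> latent space

# Toy graph: 5 vertices, 6 directed edges (sender -> receiver). All invented.
v_feat = rng.standard_normal((5, 3))
e_feat = rng.standard_normal((6, 4))
senders = np.array([0, 1, 2, 3, 4, 0])
receivers = np.array([1, 2, 3, 4, 0, 2])

h_v, h_e = enc_v(v_feat), enc_e(e_feat)

# One message-passing step: update each edge from its own state and its
# endpoint states, then update each vertex from the sum of incoming messages.
edge_mlp = mlp(3 * latent, latent)
node_mlp = mlp(2 * latent, latent)
m = edge_mlp(np.concatenate([h_e, h_v[senders], h_v[receivers]], axis=1))
agg = np.zeros_like(h_v)
np.add.at(agg, receivers, m)       # scatter-sum messages per receiving vertex
h_v = node_mlp(np.concatenate([h_v, agg], axis=1))
print(h_v.shape)                   # (5, 48): per-vertex latent states
```

In a full MeshGraphNets-style processor this edge/vertex update is repeated for several rounds with residual connections before a decoder reads out the target quantity (here, the predicted noise ε).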
“…The dislocations are then represented as random walkers which jump between neighboring nodes following a Poisson process. More recently, Bertin and Zhou [33] developed a Graph Neural Network (GNN) approach to model DDD, with the aim of replacing its time-integration procedure. They demonstrated that this DDD-GNN approach can learn ground-truth (GT) DDD data of a dislocation gliding in a field of obstacles.…”
Section: Introduction
mentioning confidence: 99%