2022
DOI: 10.1016/j.cma.2022.115128
Accelerating phase-field predictions via recurrent neural networks learning the microstructure evolution in latent space

Cited by 46 publications (23 citation statements)
References 67 publications
“…RNNs are more effective at learning the nonlinear characteristics of sequences because they share parameters across time steps, retain a memory of past inputs, and are Turing complete. RNNs were first used to describe the relationship between a sequence's current output and its past information [25, 26].…”
Section: Models and Evaluation Methods
confidence: 99%
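The recurrence described in this excerpt, where the current output depends on a hidden state carried forward from past inputs, can be illustrated with a minimal vanilla RNN cell. The following NumPy sketch uses assumed dimensions and random weights purely for illustration; it is not the architecture used in the cited works.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    """One vanilla RNN step: the new hidden state mixes the current input
    with the previous hidden state, so the output at time t depends on the
    whole past of the sequence (the shared parameters are reused each step)."""
    h_t = np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)  # carry past information forward
    y_t = h_t @ W_hy + b_y                           # current output
    return h_t, y_t

# Hypothetical sizes: 8-dimensional inputs, 16-dimensional hidden state.
rng = np.random.default_rng(0)
d_in, d_hid, d_out = 8, 16, 8
W_xh = rng.normal(size=(d_in, d_hid)) * 0.1
W_hh = rng.normal(size=(d_hid, d_hid)) * 0.1
W_hy = rng.normal(size=(d_hid, d_out)) * 0.1
b_h, b_y = np.zeros(d_hid), np.zeros(d_out)

h = np.zeros(d_hid)
sequence = rng.normal(size=(20, d_in))  # a toy sequence of 20 time steps
for x_t in sequence:
    h, y = rnn_step(x_t, h, W_xh, W_hh, W_hy, b_h, b_y)
```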
“…[33] Earlier studies have attempted to improve phase field predictions using recurrent neural networks (RNNs) such as the long short-term memory and the gated recurrent unit, predicting latent features for the phase field method. [21,34,35] These precedents have leveraged the sequential prediction capability of RNNs: starting from batches of initial structures generated by phase field simulations, they predict the latent features of future time steps of the simulation.…”
Section: Deep Learning Approaches
confidence: 99%
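The pattern described in this excerpt, an RNN that maps a window of latent features extracted from earlier phase-field frames to the latent features of the next time step, could be sketched roughly as below. This is a hedged PyTorch illustration with hypothetical dimensions (a 32-dimensional latent space, 10-step windows), not the specific network of the cited papers.

```python
import torch
import torch.nn as nn

class LatentGRU(nn.Module):
    """GRU that reads a sequence of latent microstructure vectors and
    predicts the latent vector at the next time step."""
    def __init__(self, latent_dim=32, hidden_dim=64):
        super().__init__()
        self.gru = nn.GRU(latent_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, latent_dim)

    def forward(self, z_seq):            # z_seq: (batch, time, latent_dim)
        out, _ = self.gru(z_seq)         # out: (batch, time, hidden_dim)
        return self.head(out[:, -1])     # next-step latent prediction

# Hypothetical usage: batch of 4 windows, each 10 steps of 32-dim latents.
model = LatentGRU()
z_window = torch.randn(4, 10, 32)
z_next_pred = model(z_window)                                   # shape (4, 32)
loss = nn.functional.mse_loss(z_next_pred, torch.randn(4, 32))  # dummy target
loss.backward()
```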
“…Numerous successful precedents demonstrated the benefits of integrating AI with phase field simulations. These included the utilization of machine learning to construct analytical expressions, [17] the representation of high-dimensional free energy surfaces using deep neural networks, [18] the application of principal component analysis for the analysis of the outcome of phase field modeling, [19] the prediction of the time evolution of phase field modeling through neural networks, [20–22] the integration of neural networks into the design pipeline of electromigration parameters, [23] etc. These excellent results show the significant potential of employing AI algorithms to advance the study of phase field models.…”
Section: Introduction
confidence: 99%
“…There exist a multitude of other options for dimensionality reduction of the feature space, such as isomap or kernel PCA. However, the nonlinear embeddings employed in these techniques can introduce distortions into the latent space that negate the benefits of PCA identified above (Hu et al., 2022).…”
Section: Materials Structure Representation and Quantification
confidence: 99%
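As a concrete illustration of the linear latent-space construction contrasted with nonlinear embeddings in this excerpt, the snippet below applies PCA to flattened microstructure snapshots with scikit-learn. The array shapes and component count are assumptions for the example, not values from the cited works.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical dataset: 500 microstructure snapshots on a 64x64 grid.
rng = np.random.default_rng(0)
snapshots = rng.random((500, 64, 64))
X = snapshots.reshape(len(snapshots), -1)  # flatten each field to a vector

# Linear dimensionality reduction: keep the first 32 principal components.
pca = PCA(n_components=32)
Z = pca.fit_transform(X)                   # latent representation, shape (500, 32)

# Because the PCA map is linear, fields can be projected back from the latent
# space without the distortions a nonlinear embedding may introduce.
X_rec = pca.inverse_transform(Z)
print(pca.explained_variance_ratio_.sum(), X_rec.shape)
```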