2019
DOI: 10.1016/j.commatsci.2019.02.046
Learning to fail: Predicting fracture evolution in brittle material models using recurrent graph convolutional neural networks

Abstract: We propose a machine learning approach to address a key challenge in materials science: predicting how fractures propagate in brittle materials under stress, and how these materials ultimately fail. Our methods use deep learning and train on simulation data from high-fidelity models, emulating the results of these models while avoiding the overwhelming computational demands associated with running a statistically significant sample of simulations. We employ a graph convolutional network that recognizes feature…
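The abstract's core idea, a graph convolutional network applied recurrently to evolve a fracture state, can be sketched minimally. This is an illustrative assumption, not the paper's actual architecture: the function names (`gcn_step`, `rollout`), the tanh nonlinearity, and the degree-normalized propagation rule are all stand-ins for whatever the authors used.

```python
import numpy as np

def gcn_step(A, H, W):
    """One graph-convolution step: add self-loops, normalize by node
    degree, aggregate neighbor features, then mix channels with W."""
    A_hat = A + np.eye(A.shape[0])            # self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # degree normalization
    return np.tanh(D_inv @ A_hat @ H @ W)

def rollout(A, H0, W, steps):
    """Recurrent application: feed each predicted node state back in,
    emulating fracture-state evolution over discrete load steps."""
    H, states = H0, [H0]
    for _ in range(steps):
        H = gcn_step(A, H, W)
        states.append(H)
    return states

# Toy fracture network: 3 nodes on a chain, 4 features per node.
rng = np.random.default_rng(0)
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
states = rollout(A, rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), steps=3)
```

The recurrent loop is what distinguishes this from a single-shot classifier: each predicted state becomes the input for the next step, mirroring how a trained emulator would replace repeated runs of the high-fidelity simulator.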

Cited by 78 publications (46 citation statements)
References 29 publications
“…Pooling and normalization layers allow for stepwise data simplification and for variable feature sizes, respectively. While the suitability of CNNs for materials science may not be immediately apparent, there are examples of direct applications such as materials texture recognition (Cang and Ren, 2016;Lubbers et al, 2017;Cecen et al, 2018), as well as indirect application examples in which e.g., non-visual materials data may be interpreted (Schwarzer et al, 2019).…”
Section: Short Overview and Description of Machine Learning and Data
confidence: 99%
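The quoted point about pooling and normalization layers can be made concrete with two illustrative numpy routines. The helper names and the 2x2 window are assumptions for the sketch, not taken from the cited works:

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling: halves each spatial dimension, keeping only the
    strongest activation per window (stepwise data simplification)."""
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    x = x[:h, :w]  # drop odd trailing rows/columns
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def global_avg_pool(x):
    """Global average pooling: collapses any spatial size to a single
    value per feature map, which is how variable input sizes are
    reduced to a fixed-length feature vector."""
    return x.mean()

x = np.arange(16.0).reshape(4, 4)
pooled = max_pool_2x2(x)       # shape (2, 2)
summary = global_avg_pool(x)   # scalar, independent of input size
```

The design point is that `max_pool_2x2` shrinks the representation step by step, while `global_avg_pool` produces the same output shape regardless of input size — the two roles the quote attributes to pooling and normalization stages.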
“…Instead, they used random forests and decision trees to efficiently perform an uncertainty quantification. Schwarzer et al (2019) circumvented the challenge of needing a large, statistically meaningful experimental dataset by using a significant number of simulations, thereby increasing accuracy. For that, a deep neural network was used; specifically, a graph convolutional network for fracture feature recognition within the material.…”
Section: Predictive
confidence: 99%
“…While the long short-term memory network [14] considers the global semantic information of the text from the character information, it only expands in time and thus cannot effectively capture deeper-level abstract features. When extracting the character features of a named entity, each node in the character-level convolutional layer [15][16][17] applies a nonlinear transformation and transmits its feature information to the next adjacent node. This is then passed on to multiple nearby nodes, accumulating character information.…”
Section: Feature Learning
confidence: 99%
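The accumulation of character information described in this excerpt is a receptive-field effect of stacked convolutions, which a tiny sketch can show. The function name, the width-3 kernel, and the impulse input are illustrative assumptions, not details from the cited NER models:

```python
import numpy as np

def conv1d_same(x, kernel):
    """Zero-padded 1-D convolution: each output position mixes a
    character's feature with those of its immediate neighbors."""
    k = len(kernel)
    xp = np.pad(x, k // 2)
    return np.array([xp[i:i + k] @ kernel for i in range(len(x))])

# An impulse at position 3 spreads by one position per layer: after n
# stacked width-3 layers, each position has accumulated information
# from up to 2n + 1 neighboring characters.
x = np.zeros(7)
x[3] = 1.0
k = np.ones(3)
layer1 = conv1d_same(x, k)  # nonzero at positions 2..4
layer2 = conv1d_same(layer1, k)  # nonzero at positions 1..5
```

Each additional layer widens the neighborhood a position can see, which is the "accumulation of character information" the quote contrasts with the purely temporal expansion of an LSTM.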
“…In our previous works [61,62], we have utilized machine learning approaches to efficiently emulate the high-fidelity model. The key QoIs Moore et.…”
Section: Introduction
confidence: 99%