2022
DOI: 10.3389/fmats.2021.824441

Efficient Exploration of Microstructure-Property Spaces via Active Learning

Abstract: In materials design, supervised learning plays an important role in optimization and inverse modeling of microstructure-property relations. To apply supervised learning models successfully, it is essential to train them on suitable data. Here, suitable means that the data covers the microstructure and property space sufficiently and, especially for optimization and inverse modeling, that the property space is explored broadly. For virtual materials design, data is typically generated by numerical simulations,…
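As a rough illustration of the idea summarized in the abstract, the snippet below sketches a pool-based active-learning loop: a Gaussian process surrogate is retrained after every simulation, and the next simulation is placed where the surrogate's predictive uncertainty is largest. The `property_simulation` stand-in, the candidate pool, the kernel, and all parameter values are assumptions made for this sketch only; they are not the model, data, or acquisition strategy used in the paper.

```python
# Minimal sketch of uncertainty-driven active learning with a Gaussian process
# surrogate: run the expensive simulation only where the surrogate is least
# certain, so few simulations cover the microstructure-property space broadly.
# All names and settings below are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)


def property_simulation(x):
    """Hypothetical stand-in for an expensive numerical simulation that maps
    a microstructure descriptor x to a scalar property."""
    return np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2


# Candidate pool of microstructure descriptors (2-D, purely illustrative).
pool = rng.uniform(0.0, 1.0, size=(500, 2))

# Small initial design, evaluated with the simulation.
init = rng.choice(len(pool), size=5, replace=False)
X_train = pool[init]
y_train = np.array([property_simulation(x) for x in X_train])
pool = np.delete(pool, init, axis=0)

kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
for _ in range(20):
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X_train, y_train)

    # Query the candidate with the largest predictive standard deviation,
    # i.e. the point the surrogate currently knows least about.
    _, std = gp.predict(pool, return_std=True)
    query = int(np.argmax(std))

    x_new = pool[query]
    X_train = np.vstack([X_train, x_new])
    y_train = np.append(y_train, property_simulation(x_new))
    pool = np.delete(pool, query, axis=0)

print(f"Labeled simulations after active learning: {len(X_train)}")
```

Swapping the maximum-uncertainty query for an acquisition function that also rewards rarely seen property values would bias sampling toward broad coverage of the property space, which is the goal the abstract emphasizes for optimization and inverse modeling.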

Cited by 11 publications (7 citation statements)
References: 60 publications
“…The data that support the findings of this study are openly available at the following URL/DOI: https://fordatis.fraunhofer.de/handle/fordatis/319 [64].…”
Section: Data Availability Statement (supporting)
confidence: 70%
“…In addition, the simulation framework is used to calculate material properties. The properties we focus on are the orientation dependent Young's moduli and anisotropy similar to R-values, as in [61]. R-values, however, are originally developed to express plastic anisotropy in sheet metals.…”
Section: Implementation Details (mentioning)
confidence: 99%
“…Besides constitutive modeling, the potential of using machine learning models is actively being investigated across multiple fields of computational material science. Among others, these methods are used in computational design of materials [5][6][7][8][9], in design of processes [10][11][12], in development of digital twins [13] and soft sensors [14], and in multi-scale simulations and homogenization schemes. Numerical prediction in these applications often involve frequent execution of simulations with variations of model parameters [15].…”
Section: Motivation (mentioning)
confidence: 99%
“…To tackle this issue, intelligent sampling algorithms, e.g. from the field of active learning [200], can be used for data generation and model training, see [201, 9] and [82] for application examples.
• Measuring prediction quality: Another effect of the large space of possible loading conditions is that learned constitutive models cannot be fitted accurately for any imaginable loading path.…”
Section: 6 Summary and Outlook (mentioning)
confidence: 99%
“…In this direction, physics-informed neural networks (PINNs) [5–7] have gained much attention for accurately solving the PDEs of the underlying physics. A knowledge-based sampling of inputs is another way of utilising the physics of the problem in model training [8, 9]. In addition to prior knowledge infusion, the type of ANN architecture plays an essential role in effective and effortless learning.…”
Section: Introduction (mentioning)
confidence: 99%