2021
DOI: 10.1088/2632-2153/ac3ffa

Building high accuracy emulators for scientific simulations with deep neural architecture search

Abstract: Computer simulations are invaluable tools for scientific discovery. However, accurate simulations are often slow to execute, which limits their applicability to extensive parameter exploration, large-scale data analysis, and uncertainty quantification. A promising route to accelerate simulations by building fast emulators with machine learning requires large training datasets, which can be prohibitively expensive to obtain with slow simulations. Here we present a method based on neural architecture search to bui…
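The core loop of an architecture search like the one the abstract describes can be sketched minimally as follows. This is a hypothetical random-search stand-in, not the authors' actual search strategy: `score`, `sample_architecture`, and the candidate space are toy assumptions, and in a real emulator pipeline `score` would train a network with the candidate architecture on simulation data and return its validation error.

```python
import random

def score(arch):
    """Toy analytic stand-in for train-and-validate (hypothetical objective)."""
    depth, width = arch
    # Pretend a moderate depth and width fit the data best.
    return abs(depth - 3) + abs(width - 64) / 64.0

def sample_architecture(rng):
    # Candidate = (number of hidden layers, units per layer) -- assumed search space.
    return (rng.randint(1, 6), rng.choice([16, 32, 64, 128, 256]))

def random_search(n_trials=50, seed=0):
    """Sample candidate architectures, score each, keep the best."""
    rng = random.Random(seed)
    best_arch, best_err = None, float("inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        err = score(arch)
        if err < best_err:
            best_arch, best_err = arch, err
    return best_arch, best_err
```

The design point is that the search treats architecture evaluation as a black box, so the same loop works whether each trial is a toy function or a full emulator training run.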



Cited by 84 publications (55 citation statements)
References 38 publications
“…These authors hence also use the deviation of the network prediction from the model as an additional penalty term in the training loss. Kasim et al [25] recently demonstrated neural network-based simulations for a broad range of applications, including fluid dynamics, by also optimizing the network architecture itself during training. Eichinger, Heinlein and Klawonn [33] use convolutional neural networks and techniques from image processing to learn flow patterns for the Navier-Stokes flow around objects of different shape.…”
Section: Related Work
confidence: 99%
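The penalty term mentioned in the citation above — penalizing the deviation of the network prediction from the governing model — can be sketched as a composite loss. This is a minimal illustration under toy assumptions: `predict` and `reference_model` are hypothetical stand-ins, not the cited authors' networks or equations.

```python
import numpy as np

def predict(w, x):
    return w * x                      # toy one-parameter "network"

def reference_model(x):
    return 2.0 * x                    # toy stand-in for the governing model

def total_loss(w, x, y, lam=0.1):
    """Data misfit plus a weighted penalty on deviation from the reference model."""
    pred = predict(w, x)
    data_term = np.mean((pred - y) ** 2)                     # fit to training data
    model_term = np.mean((pred - reference_model(x)) ** 2)   # model-deviation penalty
    return data_term + lam * model_term
```

The weight `lam` trades off fitting the observed data against consistency with the reference model; at `w = 2.0` with data generated by the model, both terms vanish and the loss is zero.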
“…These authors hence also use the deviation of the network prediction from the model as an additional penalty term in the training loss. Kasim et al [29] recently demonstrated neural network-based simulations for a broad range of applications, including fluid dynamics, by also optimizing the network architecture itself during the training process. Eichinger, Heinlein and Klawonn [33] use convolutional neural networks and techniques from image processing to learn flow patterns for the Navier-Stokes flow around objects of different shape.…”
Section: Related Work
confidence: 99%
“…Large initial-condition and perturbed parameter ensembles might contain hundreds of training points, while multi-model ensembles only contain tens. Using neural architecture search [46] or simpler models can help alleviate this to some extent.…”
Section: (A) Challenges (I) Small Training Data Sizes
confidence: 99%