2018 AIAA Non-Deterministic Approaches Conference
DOI: 10.2514/6.2018-0927

The role of data analysis in uncertainty quantification: case studies for materials modeling

Abstract: In computational materials science, mechanical properties are typically extracted from simulations by means of analysis routines that seek to mimic their experimental counterparts. However, simulated data often exhibit uncertainties that can propagate into final predictions in unexpected ways. Thus, modelers require data analysis tools that (i) address the problems posed by simulated data, and (ii) facilitate uncertainty quantification. In this manuscript, we discuss three case studies in materials modeling where…

Cited by 6 publications (5 citation statements)
References 28 publications
“…It is important to recognize that time-series data usually displays a certain amount of autocorrelation, in the sense that the numerical values of nearby points in the series tend to cluster close to one another. Intuition dictates that correlated data does not reveal fully “new” information about the quantity-of-interest, and so we require uncorrelated samples to achieve meaningful sampling [14].…”
Section: Pre-simulation “Sanity Checks” and Planning Tips (mentioning)
confidence: 99%
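The statement above concerns handling autocorrelated time-series output before computing statistics. As a minimal sketch (not the cited authors' code), the following Python estimates the integrated autocorrelation time and an effective number of independent samples; the function name, truncation rule, and AR(1) toy series are illustrative assumptions.

import numpy as np

def integrated_autocorrelation_time(x, max_lag=None):
    """Estimate the integrated autocorrelation time tau_int of a 1-D series.

    The effective number of independent samples is roughly N / (2 * tau_int).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = n // 4
    xc = x - x.mean()
    var = xc.var()
    tau = 0.5  # lag-0 term contributes 1/2 in this convention
    for lag in range(1, max_lag):
        rho = np.dot(xc[:-lag], xc[lag:]) / ((n - lag) * var)
        if rho <= 0:  # truncate the sum once correlations decay into noise
            break
        tau += rho
    return tau

# Toy AR(1) series standing in for correlated simulation output
rng = np.random.default_rng(0)
series = np.zeros(5000)
for i in range(1, len(series)):
    series[i] = 0.9 * series[i - 1] + rng.normal()

tau = integrated_autocorrelation_time(series)
n_eff = len(series) / (2 * tau)
print(f"tau_int ~ {tau:.1f}, effective independent samples ~ {n_eff:.0f}")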
“…to estimate the uncertainty. Strictly speaking, the standard uncertainty should be estimated using the true standard deviation of x (e.g., σ_x); given that the true standard deviation is unknown, the experimental standard deviation is used in its place as an estimate of σ_x [14].…”
Section: Computing Error In Specific Observables (mentioning)
confidence: 99%
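The quoted passage substitutes the experimental (sample) standard deviation for the unknown true σ_x when reporting a standard uncertainty. A small illustrative sketch of computing the standard uncertainty of a mean this way (not taken from the paper; the data are made up):

import numpy as np

def standard_uncertainty_of_mean(samples):
    """Standard uncertainty of the sample mean, u = s / sqrt(N), where s is
    the experimental (Bessel-corrected) standard deviation standing in for
    the unknown true standard deviation sigma_x."""
    samples = np.asarray(samples, dtype=float)
    s = samples.std(ddof=1)  # experimental standard deviation
    return s / np.sqrt(len(samples))

x = np.array([1.02, 0.98, 1.05, 0.97, 1.01])
print(f"mean = {x.mean():.3f} +/- {standard_uncertainty_of_mean(x):.3f}")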
“…The inset shows the errors in these measurements relative to the HCS, which remarkably agree to within 6% for all measurements. …write 0 ≤ x, y ≤ a ± δa (33) for some constant a and its associated uncertainty δa. For our purposes, however, it is more convenient to express the upper bound in terms of the relative uncertainty…”
Section: On Model-form Errors (mentioning)
confidence: 99%
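To spell out the step the quote alludes to, and assuming δa denotes the absolute uncertainty of the constant a (the symbol is garbled in the extracted quote), the bound can be rewritten in terms of the relative uncertainty r_a:

0 ≤ x, y ≤ a ± δa = a(1 ± δa/a) = a(1 ± r_a),  where r_a ≡ δa/a.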
“…Now, in general there is no unique convex function that fits a set of data by minimizing an objective function such as the sum-of-squared-differences [33]. However, the piecewise-linear function that interpolates the Î_i is an upper bound on all such convex functions. Moreover, in the case of a dense grid of dosages ξ_i, the variation between all such curves is negligible.…”
Section: Appendix B: Convex Reconstructions Of the Master Curve (mentioning)
confidence: 99%
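The quoted passage notes that a convex fit obtained by minimizing a sum-of-squared-differences objective is not unique, with the piecewise-linear interpolant of the Î_i serving as an upper bound. The sketch below poses one such convex least-squares fit with scipy.optimize.minimize by enforcing non-negative second divided differences; the function names, solver choice, and toy data are assumptions, not the authors' implementation.

import numpy as np
from scipy.optimize import minimize

def convex_least_squares(xi, I_hat):
    """Fit values f_i at grid points xi to data I_hat by least squares,
    subject to discrete convexity (non-decreasing slopes)."""
    xi = np.asarray(xi, dtype=float)
    I_hat = np.asarray(I_hat, dtype=float)

    def objective(f):
        return np.sum((f - I_hat) ** 2)  # sum-of-squared-differences

    def convexity(f):
        # second divided differences must be >= 0 for a discretely convex fit
        slopes = np.diff(f) / np.diff(xi)
        return np.diff(slopes)

    res = minimize(objective, x0=I_hat.copy(), method="SLSQP",
                   constraints=[{"type": "ineq", "fun": convexity}])
    return res.x

# Noisy convex toy data on a dense grid of "dosages" xi
xi = np.linspace(0.0, 1.0, 25)
rng = np.random.default_rng(1)
I_hat = (xi - 0.3) ** 2 + 0.01 * rng.normal(size=xi.size)
f_convex = convex_least_squares(xi, I_hat)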
“…Recently we implemented convex analysis of the stress-strain curve (as described here [PKD18]). scipy.optimize.minimize is used for a constrained minimization with boundary conditions of a function related to the stress-strain curve.…”
Section: Data Fitting Algorithms and Use Cases (mentioning)
confidence: 99%
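The citing work states only that scipy.optimize.minimize is used for a constrained minimization, with boundary conditions, of a function related to the stress-strain curve. A hypothetical sketch of that style of usage, fitting a bilinear elastic/hardening model with bounded parameters (the model, parameter names, and data below are illustrative assumptions, not the authors' implementation):

import numpy as np
from scipy.optimize import minimize

def bilinear_stress(strain, E, yield_stress, H):
    """Elastic branch E*strain up to yield, then linear hardening with slope H."""
    eps_y = yield_stress / E
    return np.where(strain <= eps_y, E * strain,
                    yield_stress + H * (strain - eps_y))

def residual(params, strain, stress):
    E, yield_stress, H = params
    return np.sum((bilinear_stress(strain, E, yield_stress, H) - stress) ** 2)

# Toy data standing in for a simulated stress-strain curve (stress in MPa)
strain = np.linspace(0.0, 0.05, 60)
rng = np.random.default_rng(2)
stress = bilinear_stress(strain, 200e3, 250.0, 2e3) + rng.normal(scale=2.0, size=strain.size)

res = minimize(residual, x0=[150e3, 200.0, 1e3], args=(strain, stress),
               bounds=[(1e3, 1e6), (10.0, 1e3), (0.0, 1e5)], method="L-BFGS-B")
print("fitted [E, sigma_y, H]:", res.x)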