A new validation metric is proposed that combines a threshold based on the uncertainty in the measurement data with a normalized relative error, and that is robust to large variations in the data. The metric yields the probability that a model's predictions are representative of the real world, given the specific conditions and confidence level of the experiment from which the measurements were acquired. Relative error metrics are traditionally designed for use with a series of data values, so orthogonal decomposition is employed to reduce data matrices to feature vectors, allowing the metric to be applied to fields of data. Three previously published case studies from the discipline of structural analysis, for which historical data were available, demonstrate the efficacy of this quantitative approach to validation; however, the concept could be applied to a wide range of disciplines and sectors where modelling and simulation play a pivotal role.
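The published metric is not reproduced here, but the following Python sketch illustrates the general idea under stated assumptions: both fields are first reduced to feature vectors by an orthogonal decomposition, the relative error of each coefficient is normalized by the largest measured coefficient for robustness to large variations in the data, and the fraction of coefficients falling within the measurement-uncertainty threshold is read as the probability that the predictions represent the measurements. The function name and the exact normalization are illustrative assumptions, not the published formulation.

```python
import numpy as np

def validation_probability(pred_coeffs, meas_coeffs, u_meas):
    """Illustrative validation metric (assumed form, not the published one).

    pred_coeffs, meas_coeffs : feature vectors obtained by orthogonal
        decomposition of the predicted and measured data fields.
    u_meas : measurement uncertainty expressed in coefficient space.
    Returns the fraction of coefficients whose normalized relative error
    falls within the uncertainty threshold, read as the probability that
    the predictions are representative of the measurements.
    """
    pred = np.asarray(pred_coeffs, dtype=float)
    meas = np.asarray(meas_coeffs, dtype=float)
    # Normalize by the largest measured coefficient so that near-zero
    # coefficients do not inflate the relative error (robustness to
    # large variations in the data).
    scale = np.max(np.abs(meas))
    rel_err = np.abs(pred - meas) / scale
    threshold = u_meas / scale
    return np.mean(rel_err <= threshold)
```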
A new decomposition algorithm based on QR factorization is introduced for processing and comparing the irregularly shaped stress and deformation datasets found in structural analysis. The algorithm improves the comparison of two-dimensional data fields from the surface of components when data are missing from the field of view, either because the measurement system's view is obstructed or because the component geometry leaves areas with no data. The technique enables the comparison of these irregularly shaped datasets without the interpolation or warping of the data required by some other decomposition techniques, for example, Chebyshev or Zernike decomposition. This ensures that comparisons are made only between the available data in each dataset, so similarity metrics are not biased by missing data. The decomposition and comparison technique has been applied to an impact experiment, a modal analysis and a fatigue study, with the stress and displacement data obtained from finite-element analysis, digital image correlation and thermoelastic stress analysis. The results demonstrate that the technique can process data from a range of sources and suggest its potential for use in a wide variety of applications.
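As a rough illustration of how a QR factorization can orthogonalize a basis over an irregular support, the sketch below evaluates a polynomial design matrix only at the valid pixels of a masked field and projects the data onto the resulting orthonormal columns. The monomial basis, degree and function name are assumptions for illustration rather than the published algorithm.

```python
import numpy as np

def qr_feature_vector(field, degree=4):
    """Sketch of a QR-based decomposition of an irregular 2D data field.

    `field` is a 2D array with NaN where data are missing (occlusions or
    holes in the component geometry). A polynomial design matrix is
    evaluated only at the valid pixels and then QR-factorized, giving an
    orthonormal basis over the irregular support, so no interpolation or
    warping of the data is needed.
    """
    ny, nx = field.shape
    x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
    valid = ~np.isnan(field)
    xv, yv, fv = x[valid], y[valid], field[valid]
    # Columns are monomials x**i * y**j up to the chosen total degree.
    cols = [xv**i * yv**j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.column_stack(cols)
    Q, _ = np.linalg.qr(A)   # Q is orthonormal over the valid pixels only
    return Q.T @ fv          # feature vector of decomposition coefficients
```

To compare two datasets, both fields would be projected onto a basis built from their shared valid pixels, so the coefficients, and hence any similarity metric computed from them, depend only on locations where both datasets contain data.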
A novel methodology is introduced for quantifying the severity of damage created in composite components during testing. The method uses digital image correlation combined with image-processing techniques to monitor the rate at which the strain field changes during mechanical tests. The methodology is demonstrated using two distinct experimental datasets: a ceramic matrix composite specimen loaded in tension at high temperature, and nine polymer matrix composite specimens containing fibre-waviness defects loaded in bending. The changes in the strain field owing to damage creation are shown to be a more effective indicator that a specimen has reached its proportional limit than load-extension diagrams. The technique also introduces a new approach to using experimental data to create maps of the spatio-temporal distribution of damage in a component. These maps indicate where damage occurs, and provide information about its morphology and time of occurrence. This presentation format is easier and faster to interpret than the raw data, which for some tests can consist of tens of thousands of images. The methodology has the potential to reduce the time taken to interpret large material test datasets while increasing the amount of knowledge extracted from each test.
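A minimal sketch of this monitoring idea, assuming a simple frame-differencing criterion that is not necessarily the paper's exact pipeline, is to flag each pixel at the first frame where the change in strain between consecutive images exceeds a threshold, producing a single map that encodes both the location and the time of damage:

```python
import numpy as np

def damage_map(strain_frames, threshold):
    """Illustrative spatio-temporal damage map (assumed form of the method).

    strain_frames : array of shape (T, ny, nx), one strain field per image.
    threshold     : frame-to-frame strain change above which a pixel is
                    flagged as damaged (an assumed criterion).
    Returns an (ny, nx) map holding the frame index at which each pixel
    first exceeded the threshold, or -1 where no damage was detected,
    encoding both where damage occurs and when.
    """
    frames = np.asarray(strain_frames, dtype=float)
    rate = np.abs(np.diff(frames, axis=0))       # rate of change of the strain field
    exceeded = rate > threshold
    first = np.argmax(exceeded, axis=0) + 1      # +1: diff[k] compares frames k and k+1
    first[~exceeded.any(axis=0)] = -1            # pixels that never crossed the threshold
    return first
```

A sequence of tens of thousands of DIC strain fields thus collapses into one map that can be inspected at a glance, which is the presentation-format benefit the abstract describes.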
Computational models of structures are widely used to inform decisions about the design, maintenance and operational life of engineering infrastructure, including aircraft. Confidence in the predictions from models is provided via validation processes that assess the extent to which predictions represent the real world, where the real world is often characterised by measurements made in experiments whose sophistication depends on the importance of the decision that the predictions will inform. There has been steady progress in developing validation processes that compare fields of predictions and measurements quantitatively, using the uncertainty in the measurements as a basis for assessing the significance of differences between the fields of data. In this case study, three recent advances in a validation process, which was evaluated in an inter-laboratory study five years ago, are implemented for the first time in a ground test on a fuselage at the aircraft manufacturer's site. The results show that the advances successfully address the issues raised by the inter-laboratory study, that the enhanced validation process can be implemented in an industrial environment on a complex structure, and that the model was an excellent representation of the measurements made using digital image correlation.