Metamodels aim to approximate the characteristics of functions or systems from knowledge extracted from only a finite number of samples. In recent years, kriging has emerged as a widely applied metamodeling technique for resource-intensive computational experiments. However, its prediction quality is highly dependent on the size and distribution of the given training points. Hence, in order to build proficient kriging models with as few samples as possible, adaptive sampling strategies have gained considerable attention. These techniques aim to find pertinent points in an iterative manner based on information extracted from the current metamodel. This article presents a review of adaptive schemes for kriging proposed in the literature. The objective is to provide the reader with an overview of the main principles of adaptive techniques, along with insightful details for pertinently employing the available tools depending on the application at hand. In this context, commonly applied strategies are compared with regard to their characteristics and approximation capabilities. In light of these experiments, it is found that the success of a scheme depends on the features of the specific problem and the goal of the analysis. To facilitate the entry into adaptive sampling, a guide is provided. All experiments described herein are replicable using a provided open-source toolbox.
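The iterative principle described above can be illustrated with a minimal sketch: a kriging (Gaussian-process) metamodel is refined by repeatedly adding the candidate point where the prediction variance is largest. The one-dimensional test function, the maximum-variance infill criterion, and all names below are illustrative assumptions, not taken from the review itself.

```python
# Illustrative sketch: variance-based adaptive sampling for a kriging
# metamodel (scikit-learn GP as a stand-in; the test function is assumed).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):
    # stand-in for a resource-intensive solver (assumption)
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(4, 1))            # small initial design
y = expensive_simulation(X).ravel()

candidates = np.linspace(0.0, 2.0, 201).reshape(-1, 1)
for _ in range(6):                                 # adaptive iterations
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]             # maximum-variance infill
    X = np.vstack([X, x_new])                      # enrich the design
    y = np.append(y, expensive_simulation(x_new))
```

Other infill criteria discussed in the literature (e.g. expected improvement or cross-validation-based scores) would replace only the `np.argmax(std)` line in this loop.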
The objective of this article is to introduce a new method, including model order reduction, for the life prediction of structures subjected to cyclic damage. In contrast to classical incremental schemes for damage computation, a non-incremental technique, the LATIN method, is used herein as the solution framework. This approach allows the introduction of a PGD model-reduction technique, which leads to a drastic reduction of the computational cost. The proposed framework is exemplified for structures subjected to cyclic loading, where damage is considered isotropic and micro-defect closure effects are taken into account. A difficulty in using the LATIN method here stems from the state laws, which cannot be transformed into linear relations through an internal variable transformation. A specific treatment of this issue is introduced in this work.
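The core of the PGD idea referenced above is a separated space-time representation, u(x, t) ≈ Σ_k v_k(x) λ_k(t), built by greedy rank-one enrichment. The following toy sketch (an assumption for illustration, not the authors' solver) computes such pairs for a synthetic space-time field via an alternating fixed point on the residual.

```python
# Illustrative PGD-style greedy rank-one enrichment of a space-time field
# (toy reference field; not the article's damage problem).
import numpy as np

nx, nt = 50, 200
x = np.linspace(0.0, 1.0, nx)[:, None]
t = np.linspace(0.0, 1.0, nt)[None, :]
U = np.sin(np.pi * x) * np.cos(2 * np.pi * t) + 0.1 * x * t  # rank-2 field

R = U.copy()                       # residual to be enriched against
modes = []
for _ in range(2):                 # two space-time enrichment pairs
    lam = np.ones(nt)
    for _ in range(20):            # alternating fixed point
        v = R @ lam / (lam @ lam)      # update spatial mode v(x)
        lam = R.T @ v / (v @ v)        # update temporal mode lambda(t)
    modes.append((v, lam))
    R -= np.outer(v, lam)          # deflate the converged pair

rel_err = np.linalg.norm(R) / np.linalg.norm(U)
```

Because the toy field is exactly rank two, the relative residual collapses after two enrichments; in the actual LATIN iterations the enrichment is driven by the error of the coupled nonlinear problem rather than a known reference field.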
In structural analysis with multivariate random fields, the underlying distribution functions, autocorrelations, and cross-correlations require extensive quantification. Because these parameters are difficult to measure in experiments, a lack of knowledge is involved. Therefore, in this contribution, polymorphic uncertainty models are obtained by combining the stochastic models with uncertainty models of epistemic character. Three extensions of random fields with polymorphic uncertainty modeling are introduced: interval-probability-based random fields, fuzzy-probability-based random fields, and structure-dependent autocorrelations for random fields. Applications to engineering problems are shown for each extension, where uncertainty analyses of structures with different materials are performed. In particular, a damage simulation of a concrete beam with interval-valued parametrization of the stochastic models, an application to porous media in a multiphysical structural analysis with fuzzy-valued parametrization, and an uncertainty analysis with structure-dependent autocorrelations for timber structures are presented.
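The interval-probability extension can be illustrated with a minimal sketch: a 1-D Gaussian random field with exponential autocorrelation is sampled at the two endpoints of an interval-valued correlation length, producing a rough and a smooth realization from the same aleatory draw. The kernel choice and all parameters below are assumptions for illustration, not the contribution's models.

```python
# Illustrative sketch: Gaussian random field realizations for the bounds of
# an interval-valued correlation length (exponential kernel assumed).
import numpy as np

def sample_field(l_c, n=100, seed=1):
    """Sample a zero-mean Gaussian field with exp(-|dx|/l_c) correlation."""
    x = np.linspace(0.0, 1.0, n)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / l_c)   # covariance matrix
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))        # jitter for stability
    rng = np.random.default_rng(seed)                    # fixed aleatory draw
    return L @ rng.standard_normal(n)

f_lo = sample_field(l_c=0.05)   # lower bound: weakly correlated, rough field
f_hi = sample_field(l_c=0.50)   # upper bound: strongly correlated, smooth field
```

Propagating both bounds through the structural model then brackets the response statistics, which is the essence of the interval-probability-based random field.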
One of the long-standing challenges of fatigue simulation within the continuum damage mechanics framework has been the reduction of numerical cost while maintaining acceptable accuracy. The extremely high numerical expense is due to the temporal part of the quantities of interest, which must reflect the state of a structure subjected to an exorbitant number of load cycles. A novel attempt here is to present a non-incremental LATIN-PGD framework incorporating temporal model order reduction. The LATIN-PGD method is based on the separation of the spatial and temporal parts of the mechanical variables, thereby allowing separate treatment of the temporal problem. The internal variables, especially damage, although extraneous to the variable separation, must also be treated strategically to reduce the numerical expense. A temporal multi-scale approach is proposed, based on the idea that the quantities of interest show a slow evolution along the cycles and a rapid evolution within the cycles. This assumption boils down to a finite-element-like discretisation of the temporal domain using a set of "nodal cycles" defined on the slow time scale. Within them, the quantities of interest must satisfy the global admissibility conditions and constitutive relations with respect to the fast time scale. Thereafter, the information at the "nodal cycles" can be interpolated to simulate the behaviour over the whole temporal domain. This numerical strategy is tested on different academic examples and leads to an extreme reduction in numerical expense.
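The slow/fast two-scale idea can be caricatured with a scalar toy problem: a per-cycle damage law is integrated cycle by cycle as a reference, then evaluated only at a few "nodal cycles" with large steps and interpolated in between. The damage law below is a deliberately simple assumption, not the paper's constitutive model.

```python
# Toy sketch of the two-time-scale strategy: integrate only at "nodal
# cycles" on the slow scale and interpolate between them (assumed damage law).
import numpy as np

def damage_per_cycle(d):
    # toy increment that slows down as damage accumulates (assumption)
    return 1e-3 * (1.0 - d)

n_cycles = 1000

# full incremental reference: one update per load cycle
d_full = np.zeros(n_cycles + 1)
for k in range(n_cycles):
    d_full[k + 1] = d_full[k] + damage_per_cycle(d_full[k])

# multi-scale surrogate: resolve only 11 nodal cycles with large steps
nodes = np.linspace(0, n_cycles, 11).astype(int)
d_nodes = np.zeros(len(nodes))
for i in range(len(nodes) - 1):
    dn = nodes[i + 1] - nodes[i]
    d_nodes[i + 1] = d_nodes[i] + dn * damage_per_cycle(d_nodes[i])

# interpolate nodal values over the whole temporal domain
d_interp = np.interp(np.arange(n_cycles + 1), nodes, d_nodes)
err = np.max(np.abs(d_interp - d_full))
```

The surrogate performs 10 updates instead of 1000 while staying within a few percent of the reference; in the actual framework, each nodal cycle is a full fast-scale problem satisfying admissibility and the constitutive relations, not a single scalar update.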
In order to account for random fields with mixed aleatory and epistemic uncertainty within the stochastic finite element method, a probability-box approach using the stochastic collocation method is introduced. The influence of an interval-valued correlation length on the output is investigated.
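The probability-box idea itself can be sketched independently of the finite element setting: when an epistemic parameter is only known to lie in an interval, evaluating the output distribution at the interval endpoints yields lower and upper CDF envelopes. The normal output distribution and interval-valued mean below are illustrative assumptions, not the abstract's model.

```python
# Minimal probability-box sketch: CDF envelopes of a normal output whose
# mean is an interval-valued epistemic parameter (assumed setup).
import numpy as np
from math import erf, sqrt

def normal_cdf(x, mu, sigma=1.0):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu_lo, mu_hi = -0.5, 0.5            # interval-valued epistemic parameter
xs = np.linspace(-4.0, 4.0, 9)
cdf_upper = [normal_cdf(x, mu_lo) for x in xs]   # upper envelope of the p-box
cdf_lower = [normal_cdf(x, mu_hi) for x in xs]   # lower envelope of the p-box
```

Every admissible value of the epistemic parameter produces a CDF lying between the two envelopes; in the abstract's setting, stochastic collocation supplies the output distribution for each value of the interval-valued correlation length.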