A new reliability analysis method is proposed for time-dependent problems whose limit-state functions depend explicitly on time, input random variables, and input random processes, using the total probability theorem and the concept of a composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure from time-dependent conditional probabilities, which are computed accurately and efficiently in the standard normal space using the first-order reliability method (FORM) and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral is small, it can be calculated easily using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation (MCS) or adaptive importance sampling is used, based on a Kriging metamodel of the conditional probabilities. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.
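The structure of the method described above can be sketched in a minimal example. All numbers, the limit state, and the spectral expansion below are illustrative assumptions, not the paper's actual turbine-blade model: a toy Gaussian process is built from standard normal variables via a sum-of-cosines expansion, the conditional failure probability over a time interval is estimated with the composite (series) limit state via plain MCS, and the outer total-probability integral over one random variable is evaluated with Gauss–Hermite quadrature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stationary Gaussian process via a spectral (sum-of-cosines) expansion
# in independent standard normal variables (assumed form, for illustration).
T, n_t, n_freq = 10.0, 50, 8
t = np.linspace(0.0, T, n_t)
w = np.linspace(0.5, 4.0, n_freq)            # assumed discrete frequencies
sigma_k = 0.5 / np.sqrt(n_freq)              # assumed modal standard deviations

def sample_process(n):
    """n realisations of Y(t) on the time grid, shape (n, n_t)."""
    a = rng.standard_normal((n, n_freq))
    b = rng.standard_normal((n, n_freq))
    return sigma_k * (a @ np.cos(np.outer(w, t)) + b @ np.sin(np.outer(w, t)))

def cond_pf(x, n_mcs=20_000):
    """P(failure on [0, T] | X = x): composite limit state -- failure occurs
    if the instantaneous limit state g(x, Y(t), t) <= 0 at ANY grid time."""
    y = sample_process(n_mcs)
    g = 2.0 - x - y - 0.05 * t               # toy limit state, broadcasts over t
    return np.mean(g.min(axis=1) <= 0.0)

# Outer total-probability integral over one standard normal X via
# Gauss-Hermite quadrature: E[P_f(X)] = sum_i (w_i / sqrt(pi)) P_f(sqrt(2) x_i).
nodes, weights = np.polynomial.hermite.hermgauss(10)
pf = sum(wi / np.sqrt(np.pi) * cond_pf(np.sqrt(2.0) * xi)
         for xi, wi in zip(nodes, weights))
print(f"time-dependent P_f over [0, {T}]: {pf:.4f}")
```

In the full method the inner probability would come from FORM on the linearised instantaneous limit states (or a Kriging surrogate of the conditional probability) rather than brute-force MCS; the sketch only shows how the conditioning and the outer quadrature fit together.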
The genetic algorithm paradigm is a well-known heuristic for solving many problems in science and engineering, in which candidate solutions, or "individuals", are manipulated in ways analogous to biological evolution to produce new solutions until one of the desired quality is found. As problem sizes increase, a natural question is how to exploit advances in distributed and parallel computing to speed up the execution of genetic algorithms. This thesis proposes a new distributed architecture for genetic algorithms based on distributed storage of the individuals in a persistent pool. Processors extract individuals from the pool, perform the computations, and insert the resulting individuals back into the pool. Unlike previously proposed approaches, the new approach is tailored for distributed systems in which processors are loosely coupled, failure-prone, and run at different speeds. Proof-of-concept simulation results are presented for four benchmark functions and for a real-world Product Lifecycle Design problem. We have experimented with both the crash failure model and the Byzantine failure model. The results indicate that the approach delivers improved performance due to distribution and tolerates a large fraction of processor failures under both models.
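The pool-based architecture can be illustrated with a minimal single-process simulation. Everything here is an assumption for illustration (pool size, crossover, replace-worst insertion, the sphere benchmark), not the thesis's actual design: workers asynchronously take individuals from a shared pool, produce a child, and insert it back, and a crashed worker simply loses its child without corrupting the pool, which is what makes the crash failure model cheap to tolerate.

```python
import random

random.seed(1)

DIM, POOL, STEPS = 5, 40, 4000

def fitness(x):
    """Sphere benchmark (to be minimised)."""
    return sum(v * v for v in x)

# Persistent shared pool of individuals; workers only read and replace entries.
pool = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POOL)]

def worker_step(crash_prob=0.1):
    """One asynchronous worker step: extract two parents, recombine and
    mutate, insert the child back.  A 'crash' just loses the child --
    the pool itself is never corrupted (crash failure model)."""
    a, b = random.sample(pool, 2)
    child = [(x + y) / 2 + random.gauss(0, 0.1) for x, y in zip(a, b)]
    if random.random() < crash_prob:
        return                                    # worker crashed mid-step
    worst = max(range(POOL), key=lambda i: fitness(pool[i]))
    if fitness(child) < fitness(pool[worst]):
        pool[worst] = child                       # replace-worst insertion

for _ in range(STEPS):
    worker_step()

best = min(pool, key=fitness)
print("best sphere value after", STEPS, "worker steps:", fitness(best))
```

Because workers never hold exclusive state, a slow or dead worker only reduces throughput; handling Byzantine workers would additionally require validating inserted individuals before they displace pool entries.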
We present a new experiment demonstrating destructive interference in customers’ estimates of conditional probabilities of product failure. We take the perspective of a manufacturer of consumer products and consider two situations of cause and effect. Although each cause individually produces a similar effect, we observe that, when combined, the two causes produce the opposite effect. Such negative interference of two or more product features may be exploited for better modeling of the cognitive processes taking place in customers’ minds. Doing so can enhance the likelihood that a manufacturer will be able to design a better product, or a feature within it. Quantum probability has been used to explain several commonly observed “non-classical” effects, such as the disjunction effect, the question order effect, violation of the sure-thing principle, and the Machina and Ellsberg paradoxes. In this work, we present results from a survey on the impact of multiple observed symptoms on the drivability of a vehicle. The symptoms are assumed to be conditionally independent. We demonstrate that the response statistics cannot be directly explained using classical probability, whereas a quantum formulation models them easily, as it allows for both positive and negative “interference” between events. Since quantum formalism also reproduces classical probability’s predictions, it serves as a richer paradigm for modeling decision-making behavior in engineering design and behavioral economics.
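The key formal difference between the two paradigms can be shown in a few lines. The sketch below uses the standard two-path quantum generalisation of the law of total probability, in which an interference term is added to the classical mixture; all probabilities and the interference angle are illustrative values, not the survey data from the paper.

```python
import math

def total_prob(p_b1, p_a_given_b1, p_b2, p_a_given_b2, theta=None):
    """Classical law of total probability when theta is None; otherwise the
    quantum two-path generalisation with interference angle theta:
    P(A) = p1 + p2 + 2*sqrt(p1*p2)*cos(theta), where p_i = P(B_i)P(A|B_i)."""
    p1 = p_b1 * p_a_given_b1
    p2 = p_b2 * p_a_given_b2
    if theta is None:
        return p1 + p2
    return p1 + p2 + 2.0 * math.sqrt(p1 * p2) * math.cos(theta)

# Illustrative numbers: two equally likely causes, each giving P(failure)=0.4.
classical = total_prob(0.5, 0.4, 0.5, 0.4)                 # 0.4
destructive = total_prob(0.5, 0.4, 0.5, 0.4, theta=math.pi)  # cos(pi) = -1
print("classical:", classical, " with destructive interference:", destructive)
```

With theta = pi the interference term exactly cancels the classical mixture, so the combined probability drops below either branch alone, which is the kind of “opposite effect” the abstract describes; theta = pi/2 recovers the classical prediction as a special case.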