Loss functions express, in monetary units, the loss to society incurred through the use of a product. Underlying this concept is the notion that any deviation of a product characteristic from its target degrades product performance and hence implies a loss. Spiring (1993), in response to criticisms of the quadratic loss function, developed the reflected normal loss function, which is based on the normal density function. We give modifications of these loss functions that simplify their application, and provide a framework for the reflected normal loss function that accommodates a broader class of symmetric loss situations. These modifications also facilitate the unification of the two loss functions and their comparison through expected loss. Finally, we give a simple method for determining the parameters of the modified reflected normal loss function from loss information at multiple values of the product characteristic, and an example illustrating the flexibility of the proposed model and the determination of its parameters.
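Spiring's reflected normal loss function inverts a normal density so that loss is zero at the target and saturates at a maximum loss far from it. The following is a minimal sketch of that functional form; the parameter values shown are illustrative, not taken from the paper.

```python
import math

def reflected_normal_loss(y, target, max_loss, gamma):
    """Reflected normal loss (Spiring, 1993): zero at the target,
    rising toward the upper bound max_loss as y moves away from it.
    gamma is a shape parameter controlling how fast loss saturates."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

# Illustrative values: target 10, maximum loss 5 monetary units, gamma 1.
print(reflected_normal_loss(10.0, 10.0, 5.0, 1.0))  # zero loss at target
print(reflected_normal_loss(12.0, 10.0, 5.0, 1.0))  # near max_loss off-target
```

Unlike the quadratic loss, this form is bounded, which is the practical appeal noted in the abstract: losses cannot grow without limit as the characteristic drifts from target.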
The authors of this paper present quantitative insight into a long-argued question in the hard disk drive (HDD) industry: the reliability effect of the number of head-disk interfaces (HDIs). The competition between complexity and data-transfer load is modeled from a system reliability perspective as competing components with load sharing. The product failure-probability ratio and steady-state MTTF ratio between drives of different storage capacities are derived in terms of their head-disk interface number ratio and data-transfer ratio. It is found that which of these two factors dominates reliability is conditional on the mathematical characteristics of the governing failure physics. A detailed discussion is given of system reliability with head-disk interface failures governed by a Weibull life distribution and an Inverse Power Law stress-life relationship.
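The trade-off described above can be sketched numerically. The snippet below is an illustrative simplification, not the paper's derivation: a drive is treated as a series system of identical HDIs sharing the data-transfer load equally, each with a Weibull life whose characteristic life follows an Inverse Power Law in its per-interface load. All parameter values (`beta`, `A`, `p`) are assumptions chosen for illustration.

```python
import math

def system_failure_prob(t, n_interfaces, total_load, beta=1.5, A=1e4, p=2.0):
    """Illustrative load-sharing model: n_interfaces HDIs in series,
    each carrying total_load / n_interfaces. Characteristic life
    eta(s) = A / s**p (Inverse Power Law); per-HDI life is Weibull(beta, eta)."""
    s = total_load / n_interfaces            # load sharing across HDIs
    eta = A / s ** p                         # stress-life relationship
    r_one = math.exp(-((t / eta) ** beta))   # single-HDI Weibull reliability
    return 1.0 - r_one ** n_interfaces       # series-system failure probability

# More HDIs add series components but lighten each one's load; which effect
# wins depends on beta and p, echoing the paper's conditionality finding.
print(system_failure_prob(100.0, 2, 10.0))
print(system_failure_prob(100.0, 4, 10.0))
```

With these particular assumed values of `beta` and `p`, the load-sharing benefit outweighs the added series complexity; other choices reverse the ordering, which is the point of the paper's conditional result.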
Some life tests result in few or no failures. In such cases, we can, and should, consider using degradation measurements to assess reliability. In the real world, product degradation is a stochastic process. Because such degradation is often monotonic, the literature widely uses the gamma process to describe and quantify it. In these publications, however, the scale parameter is assumed constant over time, and results under this assumption may deviate substantially from actual measurements when the degradation is nonlinear. The purpose of this paper is to extend the gamma process method to fit a broader class of degradation models. First, we use maximum likelihood estimation (MLE) to estimate the parameters under the time-constant scale-parameter assumption and analyze why the model does not fit the data well. We then propose an improved model and use Monte Carlo simulation to verify the validity of the improved method.
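A gamma degradation process has independent gamma-distributed increments, which makes it straightforward to simulate by Monte Carlo. The sketch below assumes a common nonlinear parameterization, a power-law shape function v(t) = c·t^b with constant scale; the specific function and parameter values are illustrative, not the paper's fitted model.

```python
import random

def simulate_gamma_degradation(times, c=1.0, b=1.2, scale=0.5, seed=0):
    """Monte Carlo sketch of one gamma-process degradation path:
    independent increments dX ~ Gamma(shape = v(t_i) - v(t_{i-1}), scale),
    with power-law shape function v(t) = c * t**b, so the mean path
    c * scale * t**b can be nonlinear in t. Parameters are assumed values."""
    rng = random.Random(seed)
    path, x, v_prev = [], 0.0, 0.0
    for t in times:
        v = c * t ** b
        x += rng.gammavariate(v - v_prev, scale)  # positive, monotone step
        path.append(x)
        v_prev = v
    return path

# One simulated path at four inspection times; increments are strictly
# positive, so the path is monotonically increasing.
print(simulate_gamma_degradation([1.0, 2.0, 3.0, 4.0]))
```

Averaging many such simulated paths recovers the mean degradation curve, which is how a Monte Carlo check of a fitted model can be carried out: compare simulated paths under the estimated parameters against the observed measurements.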