Tukey's lambda distribution is generalized to provide an algorithm for generating values of unimodal asymmetric random variables. This algorithm has the same advantages as the symmetric random variable generator previously given by the authors, except that the addition of another parameter complicates the problem of finding the parameter values to fit a distribution.
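The inverse-CDF idea behind such a generator can be sketched as follows. This is a minimal illustration of sampling from the four-parameter (Ramberg–Schmeiser) generalization of Tukey's lambda quantile function, not the authors' algorithm; the function names are mine, and unequal shape parameters give asymmetry:

```python
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Generalized lambda quantile function (Ramberg-Schmeiser form):
    Q(u) = lam1 + (u**lam3 - (1 - u)**lam4) / lam2.
    lam3 != lam4 yields an asymmetric (skewed) distribution."""
    return lam1 + (u**lam3 - (1.0 - u)**lam4) / lam2

def gld_sample(n, lam1, lam2, lam3, lam4, rng=None):
    """Generate n variates by inversion: apply Q to uniform(0,1) draws."""
    rng = np.random.default_rng(rng)
    u = rng.random(n)
    return gld_quantile(u, lam1, lam2, lam3, lam4)
```

Because the method is a single quantile transform of a uniform variate, it retains the speed and simplicity of the symmetric generator; the remaining work, as the abstract notes, is fitting the four parameters to a target distribution.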
Batching is a commonly used method for calculating confidence intervals on the mean of a sequence of correlated observations arising from a simulation experiment. Several recent papers have considered the effect of using batch sizes too small to satisfy the assumptions of normality and/or independence, and the resulting incorrect probabilities of the confidence interval covering the mean. This paper quantifies the effects of using batch sizes larger than necessary to satisfy the normality and independence assumptions. These effects are (1) a correct probability of covering the mean, (2) an increase in the expected half length, (3) an increase in the standard deviation and coefficient of variation of the half length, and (4) an increase in the probability of covering points not equal to the mean. For any sample size, given independent and normal batch means, the results are that (1) the effects of fewer than 10 batches are large and the effects of more than 30 batches are small, and (2) additional batches have smaller effects on confidence intervals with lower confidence levels. The results are also useful in the context of using independent replications to establish confidence intervals on the mean.
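The batch-means construction the paper analyzes can be sketched as follows; this is a generic illustration (function name and interface are mine, not the paper's), assuming the batch size is large enough that batch means are approximately independent and normal:

```python
import numpy as np
from scipy.stats import t

def batch_means_ci(x, k, alpha=0.05):
    """Confidence interval on the mean of a correlated sequence x
    using the method of batch means with k batches.

    The sequence is split into k contiguous batches; the batch means
    are treated as approximately iid normal, so a Student-t interval
    with k - 1 degrees of freedom applies."""
    x = np.asarray(x, dtype=float)
    b = len(x) // k                       # batch size (remainder discarded)
    means = x[: b * k].reshape(k, b).mean(axis=1)
    grand = means.mean()
    se = means.std(ddof=1) / np.sqrt(k)   # standard error of the grand mean
    half = t.ppf(1.0 - alpha / 2.0, k - 1) * se
    return grand, half
```

The number of batches k drives the trade-off the abstract quantifies: few batches mean wide t quantiles and variable half lengths, while many batches (hence small batch sizes) risk violating the independence and normality assumptions.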
Neural network (NN) based modeling often requires trying multiple networks with different architectures and training parameters in order to achieve an acceptable model accuracy. Typically, only one of the trained networks is selected as "best" and the rest are discarded. The authors propose using optimal linear combinations (OLCs) of the outputs of a set of trained NNs as an alternative to using a single network. Modeling accuracy is measured by mean squared error (MSE) with respect to the distribution of random inputs. Optimality is defined by minimizing the MSE, with the resultant combination referred to as the MSE-OLC. The authors formulate the MSE-OLC problem for trained NNs and derive two closed-form expressions for the optimal combination weights. An example that illustrates significant improvement in model accuracy as a result of using MSE-OLCs of the trained networks is included.
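On a finite sample, MSE-optimal combination weights reduce to a least-squares problem: regress the target on the networks' outputs (plus a constant term). The sketch below is an empirical illustration under that assumption, not the paper's closed-form expressions, and the names are mine:

```python
import numpy as np

def mse_olc_weights(outputs, target):
    """Empirical MSE-optimal linear-combination weights.

    outputs : (n, m) array, one column per trained network's output.
    target  : (n,) array of true responses.
    Returns (m + 1,) weights [bias, w_1, ..., w_m] minimizing the
    sample mean squared error of the combined output."""
    X = np.column_stack([np.ones(len(target)), outputs])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w

def combine(outputs, w):
    """Apply the combination weights to the networks' outputs."""
    X = np.column_stack([np.ones(outputs.shape[0]), outputs])
    return X @ w
```

Because least squares can only improve on any fixed column (each single network is a special case of the combination), the in-sample MSE of the combination never exceeds that of the "best" individual network.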
When an estimator of the variance of the sample mean is parameterized by batch size, one approach for selecting batch size is to pursue the minimal mean squared error (mse). We show that the convergence rate of the variance of the sample mean, and the bias of estimators of the variance of the sample mean, asymptotically depend on the data process only through its marginal variance and the sum of the autocorrelations weighted by their absolute lags. Combining these results with variance results of Goldsman and Meketon, we obtain explicit asymptotic approximations for mse, optimal batch size, optimal mse, and robustness for four quadratic-form estimators of the variance of the sample mean. Our empirical results indicate that the asymptotic approximations are reasonably accurate for sample sizes seen in practice. Although we do not discuss batch-size estimation procedures, the empirical results suggest that the explicit asymptotic batch-size approximation, which depends only on a summary measure (which we refer to as the balance point) of the nonnegative-lag autocorrelations, is a reasonable foundation for such procedures.

Keywords: simulation output analysis, estimation, overlapping batch means, standardized time series
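The bias-variance trade-off behind the optimal batch size can be illustrated empirically. The sketch below (my own illustration, not the paper's estimators or approximations) computes the nonoverlapping-batch-means estimator and its Monte Carlo mse on an AR(1) process, for which the true variance of the sample mean is available from the autocovariances:

```python
import numpy as np
from scipy.signal import lfilter

def nbm_estimator(x, b):
    """Nonoverlapping-batch-means estimator of Var(sample mean),
    batch size b."""
    k = len(x) // b
    means = x[: k * b].reshape(k, b).mean(axis=1)
    return means.var(ddof=1) / k

def ar1_var_of_mean(n, phi, sigma2=1.0):
    """Exact Var(x_bar) for a stationary AR(1) process, from
    Var(x_bar) = (1/n) * [g0 + 2 * sum_{k=1}^{n-1} (1 - k/n) g_k],
    where g_k = sigma2 * phi**k / (1 - phi**2)."""
    gamma0 = sigma2 / (1.0 - phi**2)
    lags = np.arange(1, n)
    gam = gamma0 * phi**lags
    return (gamma0 + 2.0 * np.sum((1.0 - lags / n) * gam)) / n

def mc_mse(n, phi, b, reps=100, seed=1):
    """Monte Carlo mse of the batch-means estimator at batch size b."""
    rng = np.random.default_rng(seed)
    true_v = ar1_var_of_mean(n, phi)
    errs = []
    for _ in range(reps):
        e = rng.normal(size=n + 500)
        # AR(1) recursion x[t] = phi * x[t-1] + e[t]; drop a warm-up
        # stretch so the zero initial condition washes out.
        x = lfilter([1.0], [1.0, -phi], e)[500:]
        errs.append(nbm_estimator(x, b) - true_v)
    return float(np.mean(np.square(errs)))
```

Scanning `mc_mse` over batch sizes exhibits the trade-off the asymptotic approximations make explicit: small batches incur bias from within-batch correlation, while large batches leave few batch means and inflate the estimator's variance.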