This paper proposes a method to assess the local influence of minor perturbations of a statistical model with incomplete data. The idea is to apply Cook's approach to the conditional expectation of the complete-data log-likelihood function in the EM algorithm. It is shown that the proposed method produces analytic results that are very similar to those obtained from a classical local influence approach based on the observed-data likelihood function, and that it has the potential to assess a variety of complicated models that cannot be handled by existing methods. An application to the generalized linear mixed model is investigated. Some illustrative artificial and real examples are presented.
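To fix ideas, the conditional expectation mentioned above is the Q-function that the E-step constructs and the M-step maximizes. The following is a minimal sketch of that E/M cycle for a toy 50/50 mixture of two unit-variance normals (an assumed illustrative model, not the paper's); the local influence method perturbs this Q-function rather than the observed-data likelihood.

```python
import numpy as np

def em_two_means(x, mu, n_iter=50):
    """EM for a 50/50 mixture of N(mu1, 1) and N(mu2, 1): estimate the means.

    The E-step builds the conditional expectation of the complete-data
    log-likelihood (the Q-function); the M-step maximizes it.
    """
    mu1, mu2 = mu
    for _ in range(n_iter):
        # E-step: posterior responsibilities under equal mixing weights
        d1 = np.exp(-0.5 * (x - mu1) ** 2)
        d2 = np.exp(-0.5 * (x - mu2) ** 2)
        r = d1 / (d1 + d2)
        # M-step: responsibility-weighted means maximize the Q-function
        mu1 = np.sum(r * x) / np.sum(r)
        mu2 = np.sum((1 - r) * x) / np.sum(1 - r)
    return mu1, mu2

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(2, 1, 500)])
m1, m2 = em_two_means(x, (-1.0, 1.0))
```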
A general nonlinear structural equation model with mixed continuous and polytomous variables is analysed. A Bayesian approach is proposed to estimate simultaneously the thresholds, the structural parameters and the latent variables. To solve the computational difficulties involved in the posterior analysis, a hybrid Markov chain Monte Carlo method that combines the Gibbs sampler and the Metropolis-Hastings algorithm is implemented to produce the Bayesian solution. Statistical inferences, which involve estimation of parameters and their standard errors, residual and outlier analyses, and goodness-of-fit statistics for testing the posited model, are discussed. The proposed procedure is illustrated by a simulation study and a real example.
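The hybrid scheme above alternates exact Gibbs draws for blocks with tractable full conditionals and Metropolis-Hastings updates for the rest. Here is a minimal Metropolis-within-Gibbs sketch on an assumed toy bivariate target (not the paper's structural equation model): y | x is drawn exactly, while x is treated as if its conditional were intractable and updated by random-walk MH.

```python
import numpy as np

def hybrid_sampler(n_samples, seed=0):
    """Metropolis-within-Gibbs for the toy target
    p(x, y) proportional to exp(-x^2/2) * exp(-(y - x)^2/2).

    y | x ~ N(x, 1) is a Gibbs step; x gets a random-walk
    Metropolis-Hastings update, mimicking the hybrid MCMC idea.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    draws = np.empty((n_samples, 2))

    def log_target_x(x, y):
        return -0.5 * x ** 2 - 0.5 * (y - x) ** 2

    for i in range(n_samples):
        # Gibbs step: exact draw from the full conditional of y
        y = rng.normal(x, 1.0)
        # MH step: random-walk proposal for x, accept/reject
        prop = x + rng.normal(0.0, 1.0)
        if np.log(rng.uniform()) < log_target_x(prop, y) - log_target_x(x, y):
            x = prop
        draws[i] = (x, y)
    return draws

draws = hybrid_sampler(20000)
```

Under this target the marginal of x is N(0, 1), which gives a quick sanity check on the chain.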
We propose a two-stage algorithm for computing maximum likelihood estimates for a class of spatial models. The algorithm combines Markov chain Monte Carlo methods, such as the Metropolis-Hastings-Green algorithm and the Gibbs sampler, with stochastic approximation methods, such as the off-line average and adaptive search direction. A new stopping criterion is built into the algorithm so that it terminates automatically once the desired precision has been reached. Simulation studies and applications to some real data sets have been conducted with three spatial models. We compared the proposed algorithm with a direct application of the classical Robbins-Monro algorithm using Wiebe's wheat data and found that our procedure is at least 15 times faster.
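The baseline the authors compare against is the classical Robbins-Monro recursion, optionally followed by off-line (Polyak-Ruppert) averaging of the trajectory, one of the stochastic approximation devices mentioned above. A minimal sketch on an assumed toy root-finding problem (not the spatial models of the paper):

```python
import numpy as np

def robbins_monro(h_noisy, theta0, n_iter=5000, a=1.0, seed=0):
    """Classical Robbins-Monro recursion
        theta_{n+1} = theta_n - (a / n) * H_n,
    where H_n is a noisy observation of h(theta_n), followed by the
    off-line average of the second half of the trajectory.
    """
    rng = np.random.default_rng(seed)
    theta = theta0
    path = np.empty(n_iter)
    for n in range(1, n_iter + 1):
        theta = theta - (a / n) * h_noisy(theta, rng)
        path[n - 1] = theta
    # off-line (Polyak-Ruppert-style) average over the second half
    return theta, path[n_iter // 2:].mean()

# toy target: solve h(theta) = theta - 2 = 0, observed with N(0, 1) noise
h = lambda th, rng: (th - 2.0) + rng.normal()
theta_last, theta_avg = robbins_monro(h, 0.0)
```

The diminishing step size a/n damps the noise, and the averaged estimate typically settles near the root faster than the raw iterate.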
We establish asymptotic theory for both the maximum likelihood and the maximum modified likelihood estimators in mixture regression models. Moreover, under specific and reasonable conditions, we show that the optimal convergence rate of n^(-1/4) for estimating the mixing distribution is achievable for both the maximum likelihood and the maximum modified likelihood estimators. We also derive the asymptotic distributions of two log-likelihood ratio test statistics for testing homogeneity and we propose a resampling procedure for approximating the p-value. Simulation studies are conducted to investigate the empirical performance of the two test statistics. Finally, two real data sets are analysed to illustrate the application of our theoretical results. Copyright 2004 Royal Statistical Society.
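The resampling procedure for the homogeneity p-value follows the usual parametric-bootstrap pattern: refit under the null, simulate datasets from the fitted null, and recompute the statistic on each. A minimal sketch, with an assumed cheap bimodality-sensitive statistic (negative sample kurtosis) standing in for the paper's log-likelihood ratio:

```python
import numpy as np

def bootstrap_pvalue(x, statistic, n_boot=500, seed=0):
    """Parametric-bootstrap p-value. Under H0 the data are a single
    normal sample, so we fit N(mean, sd) and resample from it; large
    values of `statistic` count as evidence against homogeneity.
    """
    rng = np.random.default_rng(seed)
    mu, sd = x.mean(), x.std()
    t_obs = statistic(x)
    t_boot = np.array([statistic(rng.normal(mu, sd, x.size))
                       for _ in range(n_boot)])
    # add-one correction keeps the estimate strictly inside (0, 1)
    return (1 + np.sum(t_boot >= t_obs)) / (1 + n_boot)

# stand-in statistic: a two-component mixture is platykurtic, so
# negative kurtosis is large when the data look bimodal
stat = lambda x: -np.mean(((x - x.mean()) / x.std()) ** 4)

rng = np.random.default_rng(1)
mixed = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])
p = bootstrap_pvalue(mixed, stat)
```

For clearly bimodal data such as this the bootstrap p-value is small, rejecting homogeneity.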