Extraordinary amounts of data are being produced in many branches of science. Proven statistical methods are no longer applicable to extraordinarily large data sets due to computational limitations. A critical step in big data analysis is data reduction. Existing investigations in the context of linear regression focus on subsampling-based methods. However, not only is this approach prone to sampling errors, but it also leads to a covariance matrix of the estimators that is typically bounded from below by a term of the order of the inverse of the subdata size. We propose a novel approach, termed information-based optimal subdata selection (IBOSS). Compared to leading existing subdata methods, the IBOSS approach has the following advantages: (i) it is significantly faster; (ii) it is suitable for distributed parallel computing; (iii) the variances of the slope parameter estimators converge to 0 as the full data size increases even if the subdata size is fixed, i.e., the convergence rate depends on the full data size; (iv) data analysis for IBOSS subdata is straightforward, and the sampling distribution of an IBOSS estimator is easy to assess. Theoretical results and extensive simulations demonstrate that the IBOSS approach is superior to subsampling-based methods, sometimes by orders of magnitude. The advantages of the new approach are also illustrated through the analysis of real data.
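The abstract does not spell out the selection rule itself. As an illustration only, the Python sketch below implements one deterministic, extreme-value-based selection rule in the spirit of IBOSS for a linear model: for each covariate in turn, it keeps the rows with the smallest and the largest values among the rows not yet selected, then fits ordinary least squares on the retained subdata. The function name iboss_subdata and the budget split of k/(2p) rows per covariate tail are choices made for this sketch and may differ from the rule analyzed in the paper.

import numpy as np

def iboss_subdata(X, y, k):
    """Select roughly k rows by keeping, for each covariate in turn, the rows
    with the most extreme (smallest and largest) values among the rows not yet
    selected.  Illustrative sketch only; the original IBOSS rule may differ."""
    n, p = X.shape
    r = k // (2 * p)                      # rows taken from each tail of each covariate
    selected = np.zeros(n, dtype=bool)
    for j in range(p):
        remaining = np.where(~selected)[0]
        order = remaining[np.argsort(X[remaining, j])]
        selected[order[:r]] = True        # r smallest values of covariate j
        selected[order[-r:]] = True       # r largest values of covariate j
    return X[selected], y[selected]

# Example: fit OLS on the subdata instead of the full data
rng = np.random.default_rng(0)
n, p = 100_000, 5
X = rng.standard_normal((n, p))
y = X @ np.arange(1, p + 1) + rng.standard_normal(n)
Xs, ys = iboss_subdata(X, y, k=1000)
beta_hat, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(ys)), Xs]), ys, rcond=None)
print(beta_hat)

Because the selection is deterministic and touches each covariate only through a partial sort, it avoids the sampling error of random subsampling and can be run independently on partitions of the data, which is consistent with the speed and parallelism claims above.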
Deriving optimal designs for nonlinear models is challenging in general. Although some recent results allow us to focus on a simple subclass of designs for most problems, deriving a specific optimal design still depends mainly on algorithmic approaches. There is a need for a general and efficient algorithm that is more broadly applicable than current state-of-the-art methods. We present a new algorithm that can be used to find optimal designs with respect to a broad class of optimality criteria, when the model parameters or functions thereof are of interest, and for both locally optimal and multistage design strategies. We prove convergence to the desired optimal design and show that the new algorithm outperforms the best available algorithm in various examples.
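For context on the algorithmic setting (candidate points, design weights, information matrix), the sketch below shows a classical method of the kind such a new algorithm would be compared against: the multiplicative algorithm for a D-optimal approximate design on a finite candidate set. It is a standard method from the optimal design literature, not the new algorithm announced in the abstract, which is not described there.

import numpy as np

def d_optimal_weights(F, n_iter=2000):
    """Classical multiplicative algorithm for a D-optimal approximate design
    on a finite candidate set.  F has one row f(x_i)^T per candidate point.
    Standard baseline from the literature, not the abstract's new algorithm."""
    n, p = F.shape
    w = np.full(n, 1.0 / n)                       # start from the uniform design
    for _ in range(n_iter):
        M = F.T @ (w[:, None] * F)                # information matrix M(w)
        d = np.einsum('ij,jk,ik->i', F, np.linalg.inv(M), F)  # variance function d(x_i, w)
        w *= d / p                                # multiplicative update; weights stay normalized
    return w

# Example: quadratic regression f(x) = (1, x, x^2) on a grid over [-1, 1]
x = np.linspace(-1, 1, 201)
F = np.column_stack([np.ones_like(x), x, x**2])
w = d_optimal_weights(F)
print(x[w > 1e-3], w[w > 1e-3])                   # mass concentrates near {-1, 0, 1}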
Deriving optimal designs for nonlinear models is in general challenging. One crucial step is to determine the number of support points needed. Current tools handle this on a case-by-case basis: each combination of model, optimality criterion, and objective requires its own proof. The celebrated de la Garza phenomenon states that under a (p − 1)th-degree polynomial regression model, any optimal design can be based on at most p design points, the minimum number of support points such that all parameters are estimable. Does this conclusion also hold for nonlinear models? If the answer is yes, it would be relatively easy to derive any optimal design, analytically or numerically. In this paper, a novel approach is developed to address this question. Using this new approach, it can easily be shown that the de la Garza phenomenon exists for many commonly studied nonlinear models, such as the Emax model, the exponential model, three- and four-parameter log-linear models, the Emax-PK1 model, as well as many classical polynomial regression models. The proposed approach unifies and extends many well-known results in the optimal design literature. It has four advantages over current tools: (i) it can be applied to many forms of nonlinear models, to continuous or discrete data, and to data with homogeneous or non-homogeneous errors; (ii) it can be applied to any design region; (iii) it can be applied to multiple-stage optimal design; and (iv) it can be easily implemented.
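As a concrete instance of the de la Garza bound in the polynomial case, the short check below verifies numerically that for quadratic regression on [-1, 1] (so p = 3) the equal-weight design on {-1, 0, 1} satisfies the Kiefer-Wolfowitz equivalence theorem and is therefore D-optimal with exactly p support points. The grid resolution is a choice made for this illustration; the support set and weights are the well-known D-optimal design for this model.

import numpy as np

# de la Garza bound for the quadratic model (p = 3): the equal-weight design
# on {-1, 0, 1} satisfies d(x) = f(x)' M^{-1} f(x) <= p on [-1, 1], with
# equality at the support points, so p support points suffice.
f = lambda x: np.array([1.0, x, x**2])            # regression functions of the quadratic model
support = [-1.0, 0.0, 1.0]                        # p = 3 support points, weight 1/3 each
M = sum(np.outer(f(x), f(x)) for x in support) / 3.0
Minv = np.linalg.inv(M)
grid = np.linspace(-1, 1, 1001)
d = np.array([f(x) @ Minv @ f(x) for x in grid])  # standardized variance function d(x)
print(d.max())                                    # ~3.0, attained only at -1, 0, and 1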