Parameter estimation is an important topic in the field of system identification. This paper explores the role of a new information-theoretic measure of data dependency in parameter estimation problems. Causation entropy is a recently proposed information-theoretic measure of influence between components of multivariate time series data. Because causation entropy quantifies the influence of one data set upon another, it is naturally related to the parameters of a dynamical system. In this paper, it is shown that by numerically estimating causation entropy from the outputs of a dynamical system, it is possible to uncover the internal parametric structure of the system and thereby establish the relative magnitudes of the system parameters. For the simple case of linear systems subject to Gaussian uncertainty, it is first shown that causation entropy can be expressed in closed form as the logarithm of a rational function of the system parameters. For more general systems, a causation entropy estimator is proposed that allows causation entropy to be computed numerically from measurement data. Results are presented for discrete-time linear and nonlinear systems, showing that numerical estimates of causation entropy can identify the dependencies between system states directly from output data. Causation entropy estimates can therefore inform parameter estimation, either by reducing the size of the parameter set or by generating a more accurate initial guess for subsequent parameter optimization.
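For concreteness, the sketch below (not taken from the paper) illustrates one way such an estimator could look in the Gaussian special case mentioned above: when the data are approximately jointly Gaussian, each conditional entropy reduces to a log-determinant of a conditional covariance, so causation entropy can be estimated from sample covariances of the observed time series. The function name gaussian_causation_entropy, the NumPy-based implementation, and the toy system matrix are illustrative assumptions, not the estimator proposed in the paper.

```python
import numpy as np


def gaussian_causation_entropy(X, source, target, cond):
    """Estimate C_{source -> target | cond} from a multivariate time series.

    X has shape (T, n): T samples of an n-dimensional state. Under a
    joint-Gaussian assumption each conditional entropy reduces to a
    log-determinant of a conditional covariance, so
    C = H(x_target[t+1] | x_cond[t]) - H(x_target[t+1] | x_cond[t], x_source[t]).
    """
    future = X[1:, [target]]                 # target component at time t+1
    past_cond = X[:-1, cond]                 # conditioning components at time t
    past_full = X[:-1, cond + [source]]      # conditioning set plus the source

    def cond_entropy(y, z):
        # Gaussian conditional entropy H(y | z) via the Schur complement
        # of the joint sample covariance.
        d = y.shape[1]
        if z.shape[1] == 0:
            cond_cov = np.atleast_2d(np.cov(y, rowvar=False))
        else:
            S = np.cov(np.hstack([y, z]), rowvar=False)
            Syy, Syz, Szz = S[:d, :d], S[:d, d:], S[d:, d:]
            cond_cov = Syy - Syz @ np.linalg.solve(Szz, Syz.T)
        return 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cond_cov))

    return cond_entropy(future, past_cond) - cond_entropy(future, past_full)


if __name__ == "__main__":
    # Toy linear system x[t+1] = A x[t] + w[t]; a nonzero entry A[i, j]
    # should produce a clearly positive causation entropy from state j to
    # state i, while zero entries should give estimates near zero.
    rng = np.random.default_rng(0)
    A = np.array([[0.9, 0.0, 0.4],
                  [0.3, 0.8, 0.0],
                  [0.0, 0.0, 0.7]])
    T, n = 5000, 3
    X = np.zeros((T, n))
    for t in range(T - 1):
        X[t + 1] = A @ X[t] + 0.1 * rng.standard_normal(n)

    for i in range(n):        # target state
        for j in range(n):    # candidate source state
            others = [k for k in range(n) if k != j]
            c = gaussian_causation_entropy(X, source=j, target=i, cond=others)
            print(f"C_{{{j} -> {i}}} = {c: .4f}")
```

In this toy example, entries of A that are zero should yield causation entropy estimates near zero, while nonzero entries should stand out, which is the kind of structural information the abstract describes using to prune the parameter set or seed a subsequent parameter optimization.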