Uncertainty quantification is a primary challenge for the reliable modeling and simulation of complex stochastic dynamics. Such problems are typically plagued by incomplete information, which may enter as uncertainty in the model parameters or even in the model itself. Furthermore, due to their dynamic nature, we need to assess the impact of these uncertainties on the transient and long-time behavior of the stochastic models and derive corresponding uncertainty bounds for observables of interest. A special class of such challenges concerns parametric uncertainty in the model, and in particular sensitivity analysis along with the corresponding sensitivity bounds for stochastic dynamics. Moreover, sensitivity analysis is further complicated in models with a large number of parameters, which renders straightforward approaches, such as gradient methods, impractical. In this paper, we derive uncertainty and sensitivity bounds for path-space observables of stochastic dynamics in terms of new goal-oriented divergences; the latter incorporate both the observables and information-theoretic objects such as the relative entropy rate. These bounds are tight, depend on the variance of the particular observable, and are computable through Monte Carlo simulation. In the case of sensitivity analysis, the derived sensitivity bounds rely on the path-space Fisher Information Matrix; hence they depend only on local dynamics and are gradient-free. These features allow for computationally efficient implementation in systems with a large number of parameters, e.g., complex reaction networks and molecular simulations.
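To make the goal-oriented bounds concrete, the following is a minimal sketch in our own notation, not the paper's exact statement, obtained from the standard Gibbs variational principle; here P is the baseline path distribution, Q a perturbed alternative, f a path-space observable, and R(Q‖P) the relative entropy:

```latex
% Sketch of a goal-oriented information inequality (notation assumed):
% P: baseline path measure, Q: perturbed measure, f: path observable,
% R(Q||P): relative entropy (normalized per unit time, this becomes the
% relative entropy rate for long-time path-space observables).
\[
  \mathbb{E}_Q[f] - \mathbb{E}_P[f]
  \;\le\;
  \inf_{c>0}\left\{
    \frac{1}{c}\log \mathbb{E}_P\!\left[e^{\,c\,(f-\mathbb{E}_P[f])}\right]
    + \frac{1}{c}\,R(Q\,\|\,P)
  \right\},
\]
% with the matching lower bound obtained by applying the inequality to -f.
% Expanding the cumulant generating function for small c yields the
% variance-dependent scaling \sqrt{2\,\mathrm{Var}_P(f)\,R(Q\|P)}, and both
% terms of the bound are Monte Carlo estimable under the baseline model P.
```

The small-c expansion is consistent with the abstract's claim that the bounds are tight and depend on the variance of the particular observable.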
We propose a new sensitivity analysis methodology for complex stochastic dynamics based on the Relative Entropy Rate. The method becomes computationally feasible in the stationary regime of the process and involves the calculation of suitable observables in path space for the Relative Entropy Rate and the corresponding Fisher Information Matrix. The stationary regime is crucial for stochastic dynamics, and here it allows us to address the sensitivity analysis of complex systems, including processes with complex landscapes that exhibit metastability, non-reversible systems from a statistical mechanics perspective, and high-dimensional, spatially distributed models. All of these systems typically exhibit non-Gaussian stationary probability distributions, and in the high-dimensional case, histograms of those distributions are impossible to construct directly. Our proposed method bypasses these challenges by relying on the direct Monte Carlo simulation of rigorously derived observables for the Relative Entropy Rate and Fisher Information in path space, rather than on the stationary probability distribution itself. We demonstrate the capabilities of the proposed methodology by focusing on two classes of problems: (a) Langevin particle systems with either reversible (gradient) or non-reversible (non-gradient) forcing, highlighting the ability of the method to carry out sensitivity analysis in non-equilibrium systems; and (b) spatially extended Kinetic Monte Carlo models, showing that the method can handle high-dimensional problems.
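As a hedged illustration of why the stationary regime makes these path-space quantities computable, consider an overdamped variant of the Langevin systems in class (a). The formulas below are a sketch in our own notation (drift b(x;θ), constant diffusion σ, stationary distribution μ^θ), derived from Girsanov's theorem rather than quoted from the paper:

```latex
% Sketch (notation assumed): for overdamped Langevin dynamics
%   dX_t = b(X_t;\theta)\,dt + \sigma\,dW_t,
% Girsanov's theorem gives the Relative Entropy Rate between the
% stationary path measures at parameters \theta and \theta' as
\[
  \mathcal{H}\big(P^{\theta}\,\|\,P^{\theta'}\big)
  = \tfrac{1}{2}\,\mathbb{E}_{\mu^{\theta}}\!\left[
      \big|\sigma^{-1}\big(b(x;\theta) - b(x;\theta')\big)\big|^{2}
    \right],
\]
% and its Hessian at \theta' = \theta is the pathwise Fisher Information Matrix
\[
  \mathcal{F}(\theta)
  = \mathbb{E}_{\mu^{\theta}}\!\left[
      \nabla_{\theta} b(x;\theta)^{\top}\,(\sigma\sigma^{\top})^{-1}\,
      \nabla_{\theta} b(x;\theta)
    \right].
\]
% Both are ergodic averages of local, drift-based observables, so they can
% be estimated by time-averaging along a single trajectory; no histogram of
% the (possibly non-Gaussian, high-dimensional) stationary law is required.
```

Note that nothing in these expressions requires the drift to be a gradient, which is why the approach extends to the non-reversible, non-equilibrium setting.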
Background: Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information on the robustness and identifiability of model parameters. However, existing sensitivity analysis approaches, such as variants of the finite difference method, can have an overwhelming computational cost in models with a high-dimensional parameter space. Results: We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on information theory methods and relies on quantifying the information loss, between time-series distributions, due to parameter perturbations. For this reason we need to work in path space, i.e., the space of all stochastic trajectories; hence the proposed approach is referred to as “pathwise”. The pathwise sensitivity analysis method is realized by employing the rigorously derived Relative Entropy Rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. Conclusions: As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, knowledge of the structure of the FIM makes it possible to efficiently address questions of parameter identifiability, estimation, and robustness. The proposed method is tested and validated on three biochemical systems, namely: (a) a protein production/degradation model, where explicit solutions are available, permitting a careful assessment of the method; (b) the p53 reaction network, where quasi-steady stochastic oscillations of the concentrations are observed, and for which continuum approximations (e.g., mean field, stochastic Langevin) break down due to persistent oscillations between high and low populations; and (c) an Epidermal Growth Factor Receptor model, which is an example of a high-dimensional stochastic reaction network with more than 200 reactions and a corresponding number of parameters.
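To illustrate how the pathwise FIM is "directly computable from the propensity functions", here is a minimal, hypothetical Python sketch for a production/degradation network in the spirit of example (a). The reactions ∅ → X with propensity k1 and X → ∅ with propensity k2·x, and all parameter values, are our assumptions for illustration; the pathwise FIM is estimated as an ergodic time average along a single Gillespie (SSA) trajectory:

```python
import numpy as np

# Hypothetical sketch: pathwise FIM for the production/degradation network
#   0 -> X (propensity a1 = k1),   X -> 0 (propensity a2 = k2*x),
# with parameters theta = (k1, k2). The stationary pathwise FIM is the
# ergodic time average of  sum_j a_j(x) s_j s_j^T,  where
# s_j = grad_theta log a_j(x; theta) is the score vector of reaction j.

rng = np.random.default_rng(0)
k1, k2 = 10.0, 1.0            # assumed rate parameters
x, t, T = 0, 0.0, 1e4         # initial copy number, clock, trajectory length
fim = np.zeros((2, 2))

# Score vectors are state-independent here: a1 depends only on k1, a2 on k2.
scores = [np.array([1.0 / k1, 0.0]), np.array([0.0, 1.0 / k2])]

while t < T:
    a = np.array([k1, k2 * x])            # propensities at the current state
    a0 = a.sum()
    dt = rng.exponential(1.0 / a0)        # exponential holding time
    for aj, sj in zip(a, scores):         # accumulate dt * sum_j a_j s_j s_j^T
        fim += dt * aj * np.outer(sj, sj)
    # fire one reaction (standard Gillespie / SSA step)
    x += 1 if rng.random() < a[0] / a0 else -1
    t += dt

fim /= t
print("Monte Carlo pathwise FIM:\n", fim)
print("exact stationary FIM:\n", np.diag([1.0 / k1, k1 / k2**2]))  # E[x]=k1/k2
```

The exact diagonal result diag(1/k1, k1/k2²) reflects the block-diagonal structure mentioned above: each reaction contributes only to the block of parameters its propensity actually depends on, so parameters entering disjoint propensities decouple in the FIM.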
Background: Emerging pathogens such as Zika, chikungunya, Ebola, and dengue viruses are serious threats to national and global health security. Accurate forecasts of emerging epidemics and their severity are critical to minimizing subsequent mortality, morbidity, and economic loss. The recent introduction of chikungunya and Zika virus to the Americas underscores the need for better methods for disease surveillance and forecasting. Methods: To explore the suitability of current approaches to forecasting emerging diseases, the Defense Advanced Research Projects Agency (DARPA) launched the 2014–2015 DARPA Chikungunya Challenge to forecast the number of cases and spread of chikungunya disease in the Americas. Challenge participants (n=38 during final evaluation) provided predictions of chikungunya epidemics across the Americas for a six-month period, from September 1, 2014 to February 16, 2015, to be evaluated by comparison with incidence data reported to the Pan American Health Organization (PAHO). This manuscript presents an overview of the challenge and a summary of the approaches used by the winners. Results: Participant submissions were evaluated by a team of non-competing government subject matter experts based on numerical accuracy and methodology. Although this manuscript does not include in-depth analyses of the results, cursory analyses suggest that simpler models appear to outperform more complex approaches that incorporated, for example, demographic information and transportation dynamics; this is likely because reporting biases can be implicitly captured in statistical models. Mosquito dynamics, population-specific information, and dengue-specific information correlated best with prediction accuracy. Conclusion: We conclude that, with careful consideration and understanding of the relative advantages and disadvantages of particular methods, implementation of an effective prediction system is feasible. However, the quality of the data needs to improve in order to more accurately predict the course of epidemics.