Mechanism-based mathematical models are the foundation for diverse applications. It is often critical to explore the massive parametric space for each model. However, for many applications, such as agent-based models, partial differential equations, and stochastic differential equations, this exploration can impose a prohibitive computational demand. To overcome this limitation, we present a fundamentally new framework that improves computational efficiency by orders of magnitude. The key concept is to train an artificial neural network using a limited number of simulations generated by a mechanistic model. This number is small enough that the simulations can be completed in a short time frame, yet large enough to enable reliable training of the neural network. The trained neural network can then be used to explore the system dynamics of a much larger parametric space. We demonstrate this concept by training neural networks to predict self-organized pattern formation and stochastic gene expression. With this framework, we can predict with high accuracy not only the one-dimensional spatial distribution (for partial differential equation models) and the probability density function (for stochastic differential equation models) of variables of interest, but also novel system dynamics absent from the training sets. We further demonstrate that using an ensemble of neural networks enables a self-contained evaluation of the quality of each prediction. Our work could serve as a platform for faster parametric-space screening of biological models with user-defined objectives.
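The workflow described above can be sketched in miniature: run a limited number of mechanistic simulations, train neural-network surrogates on them, query the surrogates across a wider parameter range, and use the spread across an ensemble as a self-contained quality estimate. The sketch below is a minimal illustrative assumption, not the paper's actual models or code; the "mechanistic model" is a toy exponential-decay ODE integrated with forward Euler, and all function names are hypothetical.

```python
import numpy as np

# Toy "mechanistic simulator" (illustrative stand-in for a costly PDE/SDE/agent-based
# model): exponential decay dx/dt = -k*x, integrated with forward Euler.
def simulate(k, x0=1.0, t_end=1.0, dt=1e-3):
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (-k * x)
    return x  # final state; true answer is x0 * exp(-k * t_end)

# Limited training set: a small number of simulator runs over the parameter k.
rng = np.random.default_rng(0)
k_train = rng.uniform(0.1, 3.0, size=40)
y_train = np.array([simulate(k) for k in k_train])

# One-hidden-layer network trained by full-batch gradient descent (NumPy only).
def train_net(k, y, hidden=16, lr=0.05, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (hidden, 1)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, hidden) / np.sqrt(hidden); b2 = 0.0
    X = k.reshape(-1, 1)
    for _ in range(epochs):
        h = np.tanh(X @ W1.T + b1)              # (n, hidden) hidden activations
        err = h @ W2 + b2 - y                   # (n,) prediction error
        gW2 = h.T @ err / len(y); gb2 = err.mean()
        dh = np.outer(err, W2) * (1.0 - h**2)   # backprop through tanh
        gW1 = dh.T @ X / len(y); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda kq: np.tanh(np.atleast_2d(kq).T @ W1.T + b1) @ W2 + b2

# Ensemble of surrogates with different initializations; the prediction spread
# serves as a self-contained estimate of each prediction's quality.
ensemble = [train_net(k_train, y_train, seed=s) for s in range(5)]

k_query = np.linspace(0.2, 2.8, 50)             # much denser parameter sweep
preds = np.stack([net(k_query) for net in ensemble])
mean_pred = preds.mean(axis=0)                  # surrogate prediction
uncertainty = preds.std(axis=0)                 # ensemble disagreement

truth = np.array([simulate(k) for k in k_query])
print("max abs error:", float(np.max(np.abs(mean_pred - truth))))
```

Once trained, each query to the ensemble costs a few matrix multiplications rather than a full numerical integration, which is the source of the efficiency gain when the sweep covers many parameter values.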