Systems biology models reveal relationships between signaling inputs and observable molecular or cellular behaviors. The complexity of these models, however, often obscures the key elements that regulate emergent properties. We use a Bayesian model reduction approach that combines Parallel Tempering with Lasso regularization to identify minimal subsets of reactions in a signaling network that are sufficient to reproduce experimentally observed data. The Bayesian approach finds distinct reduced models that fit the data equally well. A variant of this approach based on Group Lasso is applied to the NF-κB signaling network to test the necessity of feedback loops for responses to pulsatile and continuous pathway stimulation. Taken together, our results demonstrate that Bayesian parameter estimation combined with regularization can isolate and reveal core motifs sufficient to explain data from complex signaling systems.

August 1, 2019 1/23

[…] separations in the reaction kinetics to apply model reduction based on quasi-steady-state and related approximations [10,14]. Both of these methods generate reduced models but do not carry out parameter estimation to fit experimental data. Another approach to model reduction that does include parameter estimation [15] uses mixed-integer nonlinear optimization to combine parameter estimation with model reduction by reaction elimination, a technique common in the field of chemical engineering [16,17]. Drawbacks of this approach are that it requires an additional binary parameter for every reaction in the model and that the genetic algorithms used for the optimization provide only point estimates of the parameters.

Here, we develop reaction elimination in a Bayesian framework that combines parameter estimation and model reduction without requiring additional parameters.
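The role the L1 penalty plays in eliminating reactions without binary on/off variables can be illustrated with a minimal sketch. The toy model, parameter values, and function names below are ours, not the paper's: a Laplace (Lasso) log-prior is added to a Gaussian log-likelihood, so that a superfluous reaction rate shrunk to zero is favored over one kept small merely to absorb noise.

```python
import numpy as np

# Toy two-"reaction" model: an exponential decay (rate k[0]) plus a
# superfluous oscillatory term (rate k[1]). Illustrative only; the
# paper's models are ODE signaling networks, not this toy.
def simulate(k, t):
    return np.exp(-k[0] * t) + k[1] * np.sin(3 * t)

def log_posterior(k, t, y_obs, sigma=0.05, lam=0.0):
    """Gaussian log-likelihood plus a Laplace (L1 / Lasso) log-prior.

    The L1 term plays the role of the binary on/off parameters in
    mixed-integer formulations: rates shrunk to zero by the penalty
    mark reactions that can be eliminated from the network.
    """
    resid = y_obs - simulate(k, t)
    log_lik = -0.5 * np.sum(resid**2) / sigma**2
    log_prior = -lam * np.sum(np.abs(k))
    return log_lik + log_prior

t = np.linspace(0.0, 5.0, 50)
# "Data" carry a small structured wiggle that the extra reaction can absorb.
y_obs = np.exp(-t) + 0.01 * np.sin(3 * t)

full = np.array([1.0, 0.01])     # keeps the extra reaction to fit the wiggle
reduced = np.array([1.0, 0.0])   # second reaction eliminated

# Unpenalized, the fuller model is preferred; with a strong L1 prior,
# the sparser model wins despite its slightly worse fit.
assert log_posterior(full, t, y_obs) > log_posterior(reduced, t, y_obs)
assert log_posterior(reduced, t, y_obs, lam=100.0) > log_posterior(full, t, y_obs, lam=100.0)
```

In a full analysis the posterior over rates would be sampled rather than compared at two points; the sketch only shows how the penalty trades fit quality against sparsity.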
Bayesian parameter estimation (BPE) has been shown to be useful for characterizing the high-dimensional, rugged, multimodal parameter landscapes common to systems biology models [1,18–21], but it suffers from the drawback that the Markov chain Monte Carlo (MCMC) methods commonly used to sample model parameter space are often slow to converge and do not scale well with the number of model parameters. We recently showed that Parallel Tempering (PT), a physics-based method for accelerating MCMC [22], outperforms conventional MCMC for systems biology models with up to dozens of parameters [18]. Here, we apply Lasso (also known as L1 regularization), a penalty on the absolute values of the parameters being optimized, to carry out model reduction. In statistics and machine learning, Lasso is widely used for variable selection to identify a parsimonious model, that is, a minimal subset of variables required to explain the data [23]. In biology, Lasso has been widely applied to gene expression and genomic data, typically in combination with standard regression techniques [24–28] and less commonly in Bayesian frameworks [29,30]. In the mechanistic modeling context, Lasso regression has been used to predict cell-type-specific parameters in ...
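The core of Parallel Tempering can be sketched in a few lines. This is a generic illustration, not the authors' implementation: the bimodal one-dimensional target, temperature ladder, and step size below are all ours. Chains at higher temperatures (smaller β) see a flattened landscape and cross barriers easily; periodic state swaps between adjacent temperatures let those crossings propagate down to the β = 1 chain, whose samples follow the true posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Bimodal 1-D target as a stand-in for a rugged, multimodal posterior.
    return np.logaddexp(-0.5 * ((x - 3) / 0.3) ** 2,
                        -0.5 * ((x + 3) / 0.3) ** 2)

def parallel_tempering(log_post, n_steps=6000,
                       betas=(1.0, 0.4, 0.15, 0.05), step=0.5):
    """Minimal PT: one Metropolis walker per inverse temperature beta,
    with swap moves between adjacent temperatures."""
    n = len(betas)
    x = rng.normal(size=n)                 # one walker per temperature
    samples = []
    for _ in range(n_steps):
        # Within-chain Metropolis updates; hotter chains move more freely.
        for i in range(n):
            prop = x[i] + step * rng.normal()
            if np.log(rng.random()) < betas[i] * (log_post(prop) - log_post(x[i])):
                x[i] = prop
        # Swap move between a random adjacent pair of temperatures.
        i = rng.integers(n - 1)
        if np.log(rng.random()) < (betas[i] - betas[i + 1]) * (log_post(x[i + 1]) - log_post(x[i])):
            x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])               # keep only the cold (beta = 1) chain
    return np.array(samples)

samples = parallel_tempering(log_post)
# The cold chain should visit both modes (near +3 and -3), which a
# single unaccelerated Metropolis chain would rarely do here.
```

A conventional single-temperature chain started in one mode of this target would almost never reach the other; the swap moves are what make PT effective on rugged landscapes.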