We present a Julia language (Bezanson et al., 2017) package providing a practical and modular implementation of "Calibrate, Emulate, Sample" (Cleary et al., 2021), hereafter CES, an accelerated workflow for obtaining model parametric uncertainty, a task also known as Bayesian inversion or uncertainty quantification. Applying CES requires three ingredients: a computer model (written in any programming language) that depends on free parameters, a prior distribution encoding existing knowledge about those parameters, and data with which to constrain the prior distribution. The pipeline has three stages, most easily explained in reverse:

1. The goal of the workflow is to draw samples (Sample) from the Bayesian posterior distribution, that is, the prior distribution conditioned on the observed data.
2. To accelerate and regularize sampling, we train statistical emulators to represent the user-provided parameter-to-data map (Emulate).
3. The training points for these emulators are generated by the computer model and are selected adaptively around regions of high posterior mass (Calibrate).

We describe CES as an accelerated workflow because it often requires dramatically fewer evaluations of the computer model than applying sampling algorithms, such as Markov chain Monte Carlo (MCMC), directly.
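The three stages can be illustrated with a self-contained toy example. The sketch below is written in Python for brevity and uses deliberately simple stand-ins (a scalar forward model, a one-dimensional ensemble Kalman update for Calibrate, a cubic polynomial surrogate for Emulate where the paper uses Gaussian processes, and random-walk Metropolis for Sample); it demonstrates the idea of the workflow only and is not the package's actual Julia API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward model G: parameter -> data (stands in for an expensive simulator).
def G(theta):
    return np.sin(theta) + theta

theta_true = 1.0
noise_std = 0.1
y = G(theta_true) + rng.normal(0.0, noise_std)  # observed data

# --- Calibrate: ensemble Kalman inversion moves an ensemble drawn from
# the prior N(0, 2^2) toward regions of high posterior mass; every model
# run doubles as an emulator training point.
ensemble = rng.normal(0.0, 2.0, size=30)
train_thetas, train_outputs = [], []
for _ in range(5):
    outputs = G(ensemble)                      # "expensive" model evaluations
    train_thetas.append(ensemble.copy())
    train_outputs.append(outputs.copy())
    cov_tg = np.cov(ensemble, outputs)[0, 1]   # Cov(theta, G(theta))
    var_g = outputs.var(ddof=1)                # Var(G(theta))
    gain = cov_tg / (var_g + noise_std**2)     # scalar Kalman gain
    perturbed = y + rng.normal(0.0, noise_std, size=ensemble.size)
    ensemble = ensemble + gain * (perturbed - outputs)

# --- Emulate: fit a cheap surrogate to the collected input-output pairs
# (a cubic polynomial here, purely for simplicity).
X = np.concatenate(train_thetas)
Y = np.concatenate(train_outputs)
surrogate = np.poly1d(np.polyfit(X, Y, 3))

# --- Sample: random-walk Metropolis on the emulated posterior; no
# further calls to the expensive model G are needed.
def log_post(theta):
    return (-0.5 * (y - surrogate(theta))**2 / noise_std**2
            - 0.5 * theta**2 / 4.0)            # Gaussian likelihood + prior

theta, samples = 0.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
posterior_mean = np.mean(samples[5000:])
```

Because MCMC here queries only the surrogate, the total number of expensive model evaluations is just the 150 ensemble runs from the Calibrate stage, rather than the tens of thousands a direct MCMC would need.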