On the one hand, much of computational chemistry is concerned with "bottom-up" calculations which elucidate observable behavior starting from exact or approximated physical laws, a paradigm exemplified by typical quantum mechanical calculations and molecular dynamics simulations. On the other hand, "top-down" computations aiming to formulate mathematical models consistent with observed data, e.g., parametrizing force fields or binding and kinetic models, have been of interest for decades but recently have grown in sophistication with the use of Bayesian inference (BI). Standard BI provides estimates of parameter values, their uncertainties, and correlations among parameters. Used for "model selection," BI can also distinguish between model structures, such as the presence or absence of individual states and transitions. Fortunately for physical scientists, BI can be formulated within a statistical mechanics framework, and indeed, BI has led to a resurgence of interest in Monte Carlo (MC) algorithms, many of which have been directly adapted from or inspired by physical strategies. Certain MC algorithms, notably procedures using an "infinite temperature" reference state, can be successful in a 5-20 parameter BI context which would be unworkable in molecular spaces of 10³ coordinates and more. This Review provides a pedagogical introduction to BI and examines its key aspects through a physical lens, setting the computations in terms of energy landscapes and free energy calculations and describing promising sampling algorithms. Statistical mechanics and basic probability theory also provide a reference for understanding intrinsic limitations of Bayesian inference with regard to model selection and the choice of priors.
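
To make the energy-landscape analogy concrete, the sketch below (not taken from the Review) illustrates Bayesian parameter estimation by Metropolis Monte Carlo, treating the negative log posterior as an "energy" so that sampling the posterior is analogous to sampling a Boltzmann distribution at unit temperature. The exponential-decay model, uniform prior, noise level, and proposal width are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch: Bayesian estimation of a single rate constant k from
# noisy exponential-decay data, sampled with a Metropolis MC walk.
# The "energy" is E(k) = -log p(k | data), up to an additive constant.

rng = np.random.default_rng(0)

# Synthetic data (assumed model): y(t) = exp(-k_true * t) + Gaussian noise.
k_true, sigma = 1.5, 0.05
t = np.linspace(0.0, 3.0, 30)
data = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

def energy(k):
    """Negative log posterior for k, with a flat prior on k > 0."""
    if k <= 0.0:
        return np.inf                       # prior excludes k <= 0
    residuals = data - np.exp(-k * t)
    return 0.5 * np.sum(residuals**2) / sigma**2   # Gaussian likelihood

# Metropolis sampling of the posterior at unit "temperature".
k, e = 1.0, energy(1.0)
samples = []
for step in range(20000):
    k_new = k + rng.normal(0.0, 0.1)        # symmetric random-walk proposal
    e_new = energy(k_new)
    if rng.random() < np.exp(min(0.0, e - e_new)):   # accept/reject
        k, e = k_new, e_new
    samples.append(k)

samples = np.array(samples[5000:])          # discard burn-in
print(f"posterior mean k = {samples.mean():.3f} +/- {samples.std():.3f}")
```

In this picture, parameter-space sampling for BI plays the same role that configuration-space sampling plays in molecular simulation, which is why strategies such as tempering against a high-temperature reference carry over naturally.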