Absorption spectroscopy is traditionally used to determine the average gas temperature and species concentration along the laser line-of-sight by measuring the magnitude of two or more absorption transitions with different temperature dependences. Previous work has shown that the nonlinear temperature dependence of the absorption strength of each transition, set by the lower-state energy, E″, can be used to infer temperature variations along the laser line-of-sight. In principle, measuring more absorption transitions with broader-bandwidth light sources improves the ability to resolve temperature variations. Here, we introduce a singular value decomposition (SVD) framework to explore the theoretical limits to resolving temperature distributions with single-beam line-of-sight absorption measurements. We show that, in the absence of measurement noise or error, only the first ∼14 well-selected absorption features improve the temperature resolution, and that a Tikhonov regularization method improves the accuracy of the temperature inversion, particularly for recovery of the maximum gas temperature along the laser beam. Using inversion simulations, we demonstrate that a selection of temperature distributions along a laser beam line-of-sight can be resolved to within 3% for the sample cases analyzed. In Part II of this work, we explore the influence of measurement noise and error and experimentally demonstrate the technique, showing that measuring additional absorption transitions is beneficial under realistic conditions.
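The regularized inversion described above can be illustrated with a minimal numerical sketch. This is not the paper's implementation: the forward model, sizes, noise level, and regularization parameter below are all illustrative assumptions, chosen only to show how Tikhonov regularization, applied through the SVD, stabilizes an ill-conditioned inversion of the kind the abstract describes (recovering a temperature distribution x from line-of-sight absorbance data b).

```python
import numpy as np

# Hypothetical sketch (not the paper's code): Tikhonov-regularized
# inversion via SVD for an ill-conditioned linear forward model
# A @ x = b, where x stands in for a discretized temperature
# distribution and b for measured absorbances. All sizes and values
# are illustrative assumptions.
rng = np.random.default_rng(0)
n_meas, n_bins = 20, 14                      # 20 measurements, 14 unknowns

# Build a forward model with rapidly decaying singular values
# (condition number ~1e6), mimicking an ill-posed inversion.
U, _ = np.linalg.qr(rng.normal(size=(n_meas, n_meas)))
V, _ = np.linalg.qr(rng.normal(size=(n_bins, n_bins)))
s = np.logspace(0, -6, n_bins)
A = U[:, :n_bins] @ np.diag(s) @ V.T

x_true = np.exp(-np.linspace(-1, 1, n_bins) ** 2 / 0.2)  # smooth profile
b = A @ x_true + 1e-3 * rng.normal(size=n_meas)          # noisy data

def tikhonov_svd(A, b, lam):
    """Minimize ||A x - b||^2 + lam^2 ||x||^2 using the SVD of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s ** 2 + lam ** 2)           # damped inverse singular values
    return Vt.T @ (filt * (U.T @ b))

x_naive = np.linalg.pinv(A) @ b              # unregularized: noise amplified
x_reg = tikhonov_svd(A, b, lam=1e-2)         # small singular values damped
```

In this toy problem the unregularized pseudoinverse amplifies the measurement noise by up to the inverse of the smallest singular value, while the Tikhonov filter factors s/(s² + λ²) suppress the poorly conditioned directions, trading a small bias for a much smaller noise error.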