In this paper, we present a theoretical framework for interpreting the hot-spot electron temperature (Te) inferred from hard (10- to 20-keV) x-ray continuum emission in inertial confinement fusion implosions on OMEGA. We first show that the inferred Te represents the emission-weighted harmonic mean of the hot-spot Te distribution in both space and time. We then provide a scheme for selecting a photon energy for which the emission weighting approximates neutron weighting. Simulations are used to quantify the predicted relationship between the inferred Te, the neutron-weighted Ti, and implosion performance on OMEGA. In an ensemble of 1-D simulations, we find that hot-spot thermal nonequilibrium precludes a one-to-one mapping between the inferred Te and the neutron-weighted Ti. Finally, the sensitivity of the inferred Te and of the hard x-ray yield to implosion asymmetry is studied in a 3-D case study with low-harmonic-mode perturbations (i.e., laser beam power imbalance, target offset, and beam-port geometry departures from spherical symmetry) and laser imprint (lmax = 200).
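As a minimal sketch of the weighting named above (the symbols $\varepsilon_\nu$, $V$, and $t$ are notation introduced here for illustration, not taken from the paper): assuming a bremsstrahlung-like continuum emissivity whose spectral slope scales as $\varepsilon_\nu \propto \exp(-h\nu/T_e)$, fitting the logarithmic slope of the space- and time-integrated emission yields

\[
T_e^{\mathrm{inf}} = \left[\frac{\displaystyle\int \varepsilon_\nu(\mathbf{r},t)\, T_e^{-1}(\mathbf{r},t)\, \mathrm{d}V\, \mathrm{d}t}{\displaystyle\int \varepsilon_\nu(\mathbf{r},t)\, \mathrm{d}V\, \mathrm{d}t}\right]^{-1},
\]

i.e., the emission-weighted harmonic mean of the hot-spot Te distribution, consistent with the statement above that the inferred Te is an emission-weighted harmonic mean over space and time.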