Maximum Likelihood Estimation (MLE) is the bread and butter of inference for stochastic systems. Under fairly general conditions, MLE converges to the correct model in the infinite-data limit. In the context of physical approaches to inference, such as Boltzmann machines, MLE requires the arduous computation of a partition function that sums over all configurations, both observed and unobserved. We present a conceptually transparent, data-driven inference computation based on a re-weighting of observed configuration frequencies that allows us to recast the inference problem as a simpler calculation. Modeling our approach on the high-temperature limit of statistical physics, we re-weight the frequencies of observed configurations by multiplying them with the reciprocals of the Boltzmann weights, and we iteratively update the Boltzmann weights so that these products approach the high-temperature limit of the Boltzmann weights. This converts the partition-function computation required in the re-weighted MLE into a tractable leading-order high-temperature term. We show that each step of this procedure is a convex optimization. For systems with a large number of degrees of freedom, where other approaches are intractable, we demonstrate that this data-driven algorithm gives accurate inference on both synthetic data and two real-world examples.
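As a rough illustration of the kind of re-weighting described above, the sketch below fits a small pairwise Ising model by multiplying observed configuration frequencies with reciprocal Boltzmann weights and choosing couplings that flatten those products toward a common constant, mimicking the high-temperature limit in which all Boltzmann weights coincide. The least-squares formulation of the flattening step, the toy system size, and the exhaustive sampling are assumptions of this example, not the paper's actual algorithm.

```python
# Toy sketch (not the authors' exact algorithm): multiply observed configuration
# frequencies f(s) of a small Ising model by reciprocal Boltzmann weights exp(E(s))
# and fit the couplings so that the products f(s) * exp(E(s)) become as flat as
# possible across observed configurations, as in the high-temperature limit.
# Here the flattening step is posed as a convex least-squares problem.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 5                                   # number of spins (small, for exhaustive checks)

# Ground-truth pairwise Ising model: E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i
pairs = list(itertools.combinations(range(N), 2))
J_true = rng.normal(0.0, 0.5, size=len(pairs))
h_true = rng.normal(0.0, 0.3, size=N)
theta_true = np.concatenate([J_true, h_true])

def features(s):
    """Sufficient statistics of a configuration: pair products s_i s_j and single spins s_i."""
    return np.concatenate([[s[i] * s[j] for i, j in pairs], s])

def energy(s, theta):
    return -features(s) @ theta

# Draw synthetic data from the exact Boltzmann distribution (feasible for N = 5).
configs = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)
weights = np.exp([-energy(s, theta_true) for s in configs])
p_true = weights / weights.sum()
counts = rng.multinomial(200_000, p_true)          # empirical configuration counts
observed = counts > 0
f = counts[observed] / counts.sum()                # observed frequencies f(s)

# Re-weighted fit: choose theta so that log[f(s) * exp(E(s; theta))] is as close to
# a common constant as possible across observed configurations. Because
# log f(s) + E(s; theta) is linear in theta, this is a convex least-squares problem
# that never touches unobserved configurations or the full partition function.
X = np.array([features(s) for s in configs[observed]])   # rows: phi(s)
y = np.log(f)                                            # target: phi(s) @ theta + const
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])         # absorb the constant offset
theta_fit, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
theta_fit = theta_fit[:-1]                               # drop the fitted offset

print("max |theta_fit - theta_true| =", np.abs(theta_fit - theta_true).max())
```

In this toy setting every observed frequency is well estimated, so a single convex fit suffices; the iterative update of the Boltzmann weights described in the abstract addresses the regime where that is no longer true.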