Understanding the dynamics and physics of climate extremes will be a critical challenge for twenty-first-century climate science. Increasing temperatures and saturation vapor pressures may exacerbate heat waves, droughts, and precipitation extremes. Yet our ability to monitor temperature variations is limited and declining. Between 1983 and 2016, the number of observations in the University of East Anglia Climatic Research Unit (CRU) Tmax product declined precipitously (5900 → 1000); 1000 poorly distributed measurements are insufficient to resolve regional Tmax variations. Here, we show that combining long (1983 to the near present), high-resolution (0.05°), cloud-screened archives of geostationary satellite thermal infrared (TIR) observations with a dense set of ~15 000 station observations explains 23%, 40%, 30%, 41%, and 1% more variance than the CRU for the globe, South America, Africa, India, and areas north of 50°N, respectively; even greater levels of improvement are shown for the 2011–16 period (28%, 45%, 39%, 52%, and 28%, respectively). Described here for the first time, the TIR Tmax algorithm uses subdaily TIR distributions to screen out cloud-contaminated observations, providing accurate (correlation ≈ 0.8) gridded emission Tmax estimates. Blending these gridded fields with ~15 000 station observations provides a seamless, high-resolution source of accurate Tmax estimates that performs well in areas lacking dense in situ observations and even better where in situ observations are available. Cross-validation results indicate that the satellite-only, station-only, and combined products all produce accurate estimates (R ≈ 0.8–0.9, mean absolute errors ≈ 0.8–1.0°C). Hence, the Climate Hazards Center Infrared Temperature with Stations (CHIRTSmax) dataset should provide a valuable resource for climate change studies, climate extreme analyses, and early warning applications.
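
To make the two-step idea concrete, the sketch below illustrates (i) screening subdaily TIR brightness temperatures for cloud contamination, since clouds register much colder than clear-sky surface emission, and (ii) adjusting the resulting gridded emission Tmax toward station values. This is a minimal illustrative example only: the 10°C cloud-depression threshold, the function names, and the inverse-distance blending are assumptions for exposition, not the published CHIRTSmax procedure.

```python
import numpy as np

def screen_clouds(subdaily_tb, clim_tmax, max_depression=10.0):
    """Mask subdaily TIR brightness temperatures (degC) that are likely
    cloud contaminated: slots more than `max_depression` degC colder than
    the climatological clear-sky Tmax are set to NaN (illustrative threshold)."""
    subdaily_tb = np.asarray(subdaily_tb, dtype=float)
    clear = subdaily_tb > (clim_tmax - max_depression)
    return np.where(clear, subdaily_tb, np.nan)

def emission_tmax(subdaily_tb, clim_tmax):
    """Estimate daily emission Tmax as the warmest cloud-screened observation."""
    screened = screen_clouds(subdaily_tb, clim_tmax)
    return np.nanmax(screened) if np.isfinite(screened).any() else np.nan

def blend_with_stations(sat_tmax_grid, grid_xy, stn_resid, stn_xy, power=2.0):
    """Add an inverse-distance-weighted surface of station residuals
    (station Tmax minus co-located satellite Tmax) to the satellite grid.
    Simple IDW stands in here for the more sophisticated published blending."""
    grid_xy = np.asarray(grid_xy, dtype=float)   # (npix, 2) lon/lat of grid cells
    stn_xy = np.asarray(stn_xy, dtype=float)     # (nstn, 2) lon/lat of stations
    d = np.linalg.norm(grid_xy[:, None, :] - stn_xy[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, 1e-6) ** power       # avoid division by zero at stations
    w /= w.sum(axis=1, keepdims=True)
    return sat_tmax_grid + w @ np.asarray(stn_resid, dtype=float)

# Example: 48 half-hourly brightness temperatures for one pixel (degC);
# a simulated midday cloud produces a cold dip that the screening removes.
tb = 20 + 8 * np.sin(np.linspace(0, np.pi, 48))
tb[20:25] -= 25.0
print(emission_tmax(tb, clim_tmax=29.0))  # ~28, ignoring the cloud-contaminated slots
```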