Wildfires greatly increase a landscape's vulnerability to flooding and erosion by removing vegetation and altering soils. Fire damage to soil increases with soil temperature, and, for fires without smoldering combustion, the current understanding is that soil temperatures rise as fuel load and fire intensity increase. Here, however, we show that this understanding, which is based on experiments under homogeneous conditions, does not necessarily apply at the more relevant larger scale, where soils, vegetation, and fire characteristics are heterogeneous. In a catchment-scale fire experiment, soils were surprisingly cool where fuel load was high and the fire was hot and, conversely, soils were hot where they were expected to be cooler. This indicates that the greatest fire damage to soil can occur where fuel load and fire intensity are low rather than high, a finding with important implications for the management of fire-prone areas before, during, and after fire events.