Radiative cooling effects behind a reflected shock wave are calculated for an absorbing-emitting gas by means of an expansion procedure in the small density ratio ε across the shock front. For a gray-gas shock layer with an optical thickness of order unity or less, the absorption integral is simplified by use of the local temperature approximation, whereas for larger optical thicknesses a Rosseland diffusion-type solution is matched with the local-temperature-approximation solution. The calculations show that the shock wave attenuates at first and then accelerates to a constant velocity. Under appropriate conditions the gas enthalpy near the wall may increase at intermediate times before ultimately decreasing to zero. A two-band absorption model yields end-wall radiant heat fluxes that agree well with available shock-tube measurements.
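The two-band end-wall flux estimate described above can be illustrated with a minimal numerical sketch. The band edges, optical thicknesses, and gas temperature below are hypothetical placeholders, not values from the paper: each band contributes the blackbody power fraction in its wavelength interval, weighted by the emissivity 1 − exp(−τ) of an isothermal slab of optical thickness τ.

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K

def planck_lambda(lam, T):
    """Spectral blackbody emissive power per unit wavelength, W m^-3."""
    x = H * C / (lam * KB * T)
    return 2.0 * math.pi * H * C**2 / lam**5 / math.expm1(x)

def band_fraction(lam_lo, lam_hi, T, n=2000):
    """Fraction of total blackbody power emitted between lam_lo and lam_hi
    (trapezoidal integration of the Planck function)."""
    dl = (lam_hi - lam_lo) / n
    total = 0.0
    for i in range(n + 1):
        w = 0.5 if i in (0, n) else 1.0
        total += w * planck_lambda(lam_lo + i * dl, T)
    return total * dl / (SIGMA * T**4)

# Hypothetical two-band model: a strongly absorbing short-wavelength band
# and a weaker, optically thin longer-wavelength band.
T = 10000.0  # shocked-gas temperature, K (illustrative only)
bands = [
    {"lam": (50e-9, 200e-9), "tau": 2.0},   # band 1: moderately thick
    {"lam": (200e-9, 5e-6),  "tau": 0.1},   # band 2: optically thin
]

q_wall = 0.0
for b in bands:
    frac = band_fraction(*b["lam"], T)
    # isothermal-slab emissivity: 1 - exp(-tau)
    q_wall += frac * (1.0 - math.exp(-b["tau"])) * SIGMA * T**4

print(f"end-wall radiant flux estimate: {q_wall:.3e} W/m^2")
```

The optically thin limit 1 − exp(−τ) ≈ τ is the regime where the local temperature approximation applies; at large τ each band saturates toward its blackbody contribution, consistent with the diffusion-limit matching described in the abstract.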