Statistics of fade slope at 50 GHz have been derived from one year of slant-path attenuation measurements, using time interval lengths between 2 and 200 s. From these statistics, the dependence of fade slope on attenuation level and time interval can be quantified. It has been found that the ITU-R model, specified for frequencies up to 30 GHz, fits the experimental conditional distributions well. Nevertheless, some deviations are observed at the lowest attenuation levels, which may be attributed to cloud effects. Such results are required for the design of communications systems operating in the V-band.
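As a rough illustration of the conditional distributions mentioned above, the ITU-R fade-slope model (Recommendation ITU-R P.1623) describes the probability density of the fade slope ζ (in dB/s), conditioned on the attenuation level A, with a standard deviation that grows with A. The sketch below implements only the general shape of that conditional PDF; the proportionality constant `s` is an illustrative placeholder, not the full ITU-R parameterisation (which also depends on the time interval length and the low-pass filter bandwidth used on the attenuation data).

```python
import math

def fade_slope_pdf(zeta, attenuation_db, s=0.01):
    """Conditional PDF of fade slope zeta (dB/s) given attenuation A (dB).

    Shape used in the ITU-R fade-slope model:
        p(zeta | A) = 2 / (pi * sigma * (1 + (zeta / sigma)**2)**2)
    with sigma proportional to A. The factor s is an assumed placeholder
    for the model's dependence on time interval and filter bandwidth.
    """
    sigma = s * attenuation_db  # std. deviation of fade slope, grows with A
    return 2.0 / (math.pi * sigma * (1.0 + (zeta / sigma) ** 2) ** 2)
```

The density is symmetric about zero (fades and fade recoveries are equally likely) and integrates to one for any attenuation level, so the conditioning on A only widens or narrows the distribution.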