We propose a model that describes the signal fading process due to scintillation in the presence of rain. We analysed a data set of up-link (30 GHz) and down-link (20 GHz) attenuation values averaged over 1-second intervals. The data are samples from 10 significant events, for a total of 180,000 s, recorded at the Spino d'Adda station (northern Italy) using the Olympus satellite.

Our analysis is based on the observation that the plot of attenuation versus time resembles the behaviour of a self-similar process. After several considerations, we propose a fractional Brownian motion model for the scintillation process. We describe the model in detail, with figures showing the apparent self-similarity of the measured data. We then show that the Hurst parameter of the process is a simple function of the rain fade. We describe a method for producing random data that interpolate the measured samples while preserving some of their interesting statistical properties. This method can be used for simulating fade countermeasure systems.

As a possible application of the model, we show how to optimise fade measurement times for fade countermeasure systems.
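To illustrate the kind of synthesis the model enables, the following is a minimal sketch (not the authors' method) of generating a fractional Brownian motion path with a prescribed Hurst parameter, using the exact Cholesky factorisation of the fractional Gaussian noise covariance. The function name, the sample count and the example value H = 0.7 are illustrative assumptions; in the paper the Hurst parameter would instead be derived from the measured rain fade.

    import numpy as np

    def fbm_path(n, hurst, dt=1.0, rng=None):
        """Generate one fractional Brownian motion path of n increments.

        Exact Cholesky method: O(n^3), adequate for short paths.
        `hurst` must lie in (0, 1); `dt` is the sampling interval.
        """
        rng = np.random.default_rng() if rng is None else rng
        k = np.arange(n)
        # Autocovariance of unit-variance fractional Gaussian noise at lag k.
        gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                       - 2 * np.abs(k) ** (2 * hurst)
                       + np.abs(k - 1) ** (2 * hurst))
        cov = gamma[np.abs(k[:, None] - k[None, :])]
        L = np.linalg.cholesky(cov)
        # Correlated increments, scaled for the sampling interval.
        fgn = (L @ rng.standard_normal(n)) * dt ** hurst
        # Cumulative sum of the increments gives the fBm path, starting at 0.
        return np.concatenate(([0.0], np.cumsum(fgn)))

    # Example: a 10-minute path sampled once per second with H = 0.7 (arbitrary).
    path = fbm_path(600, hurst=0.7, dt=1.0)

A path generated this way is self-similar in the statistical sense discussed in the paper; conditioning such a generator on measured 1-second samples is what produces interpolating data with the desired properties.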