Abstract-One goal of radar remote sensing is the extraction of terrain statistics and surface dielectric properties from backscatter data for some range of wavelengths, incidence angles, and polarizations. This paper addresses empirical approaches used to estimate terrain properties from radar data over a wider range of roughness than permitted by analytical models. Many empirical models assume, at least implicitly, that roughness parameters such as rms height or correlation length are independent of the horizontal length scale over which they are measured. In contrast, recent surveys of natural terrain show that self-affine, or power-law, scaling between horizontal scale and roughness statistics is very common. The rms slope at the horizontal scale of the illuminating wavelength, s(λ), is directly related to the variogram or structure function of a self-affine surface, can be readily obtained from field-measured topography, and, when used in an empirical model, avoids the need for arbitrary wavelength-dependent terms. To facilitate comparison with earlier approaches, an expression linking the rms height at some profile length to the rms (Allan) deviation at an equivalent horizontal sampling interval is obtained from numerical simulations. An empirical model for polarimetric scattering as a function of s(λ) at 35°-60° incidence from smooth to rugged lava surfaces is derived and compared with earlier models for backscatter from modestly rough soil surfaces. The asymptotic behavior of polarization ratios for the lava flows suggests that the depolarization of linearly polarized illuminating signals occurs as a first-order process, likely through single scattering by rock edges or other discontinuities, rather than solely as the multiple-scattering effect predicted by some analytical models. Efforts to fully understand radar scattering from geological surfaces need to incorporate wavelength-scale roughness, perhaps through computational simulations.
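As a concrete illustration of how s(λ) can be obtained from field-measured topography, the following minimal sketch computes the rms (Allan) deviation ν(Δx) of a height profile and the scale-dependent rms slope under the common self-affine convention s(Δx) = ν(Δx)/Δx, evaluated at a horizontal step Δx near the illuminating wavelength of interest. The synthetic power-law profile generator, function names, and parameter values here are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def allan_deviation(z, lag):
    """RMS (Allan) deviation nu: rms of height differences between
    profile points separated by `lag` samples."""
    dz = z[lag:] - z[:-lag]
    return np.sqrt(np.mean(dz**2))

def rms_slope(z, dx, lag):
    """Scale-dependent rms slope s(step) = nu(step) / step,
    where step = lag * dx is the horizontal baseline."""
    return allan_deviation(z, lag) / (lag * dx)

# Synthetic self-affine (fractional-Brownian-like) profile, for illustration only.
rng = np.random.default_rng(0)
n, dx, hurst = 4096, 0.01, 0.6           # samples, spacing [m], assumed Hurst exponent
freqs = np.fft.rfftfreq(n, d=dx)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-(hurst + 0.5))  # amplitude ~ f^-(H+1/2), i.e. PSD ~ f^-(2H+1)
phase = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
z = np.fft.irfft(amp * np.exp(1j * phase), n=n)
z *= 0.05 / np.std(z)                    # scale to 5 cm rms height

# For a self-affine surface, nu(step) ~ step**H, so s(step) ~ step**(H - 1).
lags = np.array([1, 2, 4, 8, 16, 32, 64])
nus = np.array([allan_deviation(z, k) for k in lags])
H_est, _ = np.polyfit(np.log(lags * dx), np.log(nus), 1)
print(f"estimated Hurst exponent: {H_est:.2f}")
for k in lags:
    print(f"step = {k * dx:6.2f} m   s = {rms_slope(z, dx, k):.3f}")
```

In practice one would evaluate s at the baseline closest to the radar wavelength; the same log-log fit of ν against Δx also recovers the Hurst exponent that characterizes the self-affine scaling, which is why no separate wavelength-dependent term is needed.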