Abstract-Microwave radiometry will emerge as an important tool for global remote sensing of near-surface soil moisture in the coming decade. In this modeling study, we find that hillslope-scale topography (tens of meters) influences microwave brightness temperatures in a way that produces bias at coarser scales (kilometers). The physics underlying soil moisture remote sensing suggests that the effects of topography on brightness temperature observations are twofold: 1) the spatial distribution of vegetation, moisture, and surface and canopy temperature depends on topography and 2) topography determines the incidence angle and polarization rotation that the observing sensor makes with the local land surface. Here, we incorporate the important correlations between factors that affect emission (e.g., moisture, temperature, and vegetation) and topographic slope and aspect. Inputs to the radiative transfer model are obtained at hillslope scales from a mass-, energy-, and carbon-balance-resolving ecohydrology model. Local incidence and polarization rotation angles are explicitly computed from the local terrain slope and aspect together with the sky position of the sensor. We investigate both the spatial organization of hillslope-scale brightness temperatures and the sensitivity of spatially aggregated brightness temperatures to satellite sky position. For one computational domain considered, hillslope-scale brightness temperatures vary from approximately 121 to 317 K in the horizontal polarization and from approximately 117 to 320 K in the vertical polarization. Including hillslope-scale heterogeneity in the factors affecting emission can change watershed-aggregated brightness temperature by more than 2 K, depending on topographic ruggedness. These findings have implications for soil moisture data assimilation and for the disaggregation of brightness temperature observations to hillslope scales.
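The abstract notes that the local incidence angle follows from the terrain slope and aspect plus the sensor's sky position. A minimal geometric sketch of that computation is below; the function name, coordinate conventions (z up, x east, y north; aspect and azimuth measured clockwise from north), and degree units are illustrative assumptions, not the paper's actual implementation, which also handles polarization rotation.

```python
import numpy as np

def local_incidence_angle(slope_deg, aspect_deg, sensor_zenith_deg, sensor_azimuth_deg):
    """Angle between the local terrain normal and the direction to the sensor.

    Hypothetical helper: slope/aspect describe the tilted facet; the sensor
    sky position is given as zenith and azimuth angles. All angles in degrees.
    """
    s = np.radians(slope_deg)
    a = np.radians(aspect_deg)
    tz = np.radians(sensor_zenith_deg)
    ta = np.radians(sensor_azimuth_deg)
    # Unit normal of a facet with slope s whose downslope direction faces azimuth a
    n = np.array([np.sin(s) * np.sin(a), np.sin(s) * np.cos(a), np.cos(s)])
    # Unit vector from the surface toward the sensor
    v = np.array([np.sin(tz) * np.sin(ta), np.sin(tz) * np.cos(ta), np.cos(tz)])
    # Clip guards against round-off pushing the dot product outside [-1, 1]
    return np.degrees(np.arccos(np.clip(np.dot(n, v), -1.0, 1.0)))
```

For flat terrain the local incidence angle reduces to the sensor zenith angle; a facet tilted toward the sensor reduces it, and one tilted away increases it, which is the geometric source of the hillslope-scale brightness-temperature spread the abstract reports.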