Estimates of fault slip rates are an integral part of assessing seismic hazard because they affect estimates of earthquake renewal and moment release rates. For some faults, however, slip rate estimates vary among geodetic studies or between geodetic and geologic investigations. These differences may reflect time-transient deformation, but they may also reflect differences in the spatial scale of observations and, in particular, a lack of knowledge of how fault geometries change with depth. This study characterizes the impact of common reverse fault geometries, such as ramp-flat structures, on geodetic and geologic slip rate estimates using boundary element numerical models. The objective is to determine how down-dip geometry affects slip rates throughout the seismic cycle and the ability to accurately estimate these rates when fault geometry is poorly known. A suite of two-dimensional models is used to explore ramp-flat geometry, commonly observed in reverse fault systems. The models are subjected to far-field loading representative of horizontal surface velocities recorded by GPS and are designed to account for both interseismic and geologic timescales by coupling creeping fault segments with either locked or unlocked segments. Using the forward models to explore the impact of down-dip geometry, we find that geodetic and geologic slip rate estimates do not always match. When we invert for slip rate, the mismatch becomes even greater if the assumed fault geometry is incorrect. The Longitudinal Valley Fault of Taiwan and the Ventura-Pitas Point fault of California provide two real-world case studies in which thrust fault geometry is incompletely known and, in the case of the Longitudinal Valley Fault, there is documented disagreement between geologic and geodetic slip rates.
Models of these structures corroborate the results from the idealized model suite, indicating that using correct fault geometries is important for reconciling differences between geodetic and geologic investigations.

HIGHLIGHTS

This study seeks to explain differences between fault slip rate estimates derived from geologic and geodetic techniques, and among geodetic techniques themselves. Geologic techniques rely on fieldwork and measurements of slip proxies at the surface, whereas geodetic techniques, including GPS, collect surface displacement data and use them to model subsurface fault displacements. Reconciling this discrepancy and correctly estimating fault slip rates are fundamental to quantifying earthquake risk. Our hypothesis is that discrepancies between slip rate estimates from geologic and geodetic techniques arise, at least partially, from the use of incorrect fault geometries in geodetic inversions. The hypothesis is supported, with the caveat that incorrect geometries do not explain the entire discrepancy, indicating that other factors may also be at play.
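The geometry-induced bias discussed above can be illustrated with a deliberately simplified analog: the classic arctangent interseismic velocity profile for a buried screw dislocation (a strike-slip idealization, not the reverse-fault boundary element models used in this study). Fitting synthetic surface velocities while assuming an incorrect locking depth biases the recovered slip rate, mirroring the effect of an incorrect fault geometry in a geodetic inversion. All parameter values here are illustrative choices, not results from the study:

```python
import numpy as np

def surface_velocity(x_km, slip_rate, locking_depth_km):
    """Interseismic surface velocity for a fault creeping below a
    locked upper segment (Savage & Burford, 1973 screw-dislocation model):
    v(x) = (s / pi) * arctan(x / D)."""
    return (slip_rate / np.pi) * np.arctan(x_km / locking_depth_km)

# Synthetic "observed" velocities from an assumed true geometry
x = np.linspace(-100.0, 100.0, 201)   # km from the fault trace
true_rate = 10.0                      # mm/yr (illustrative)
true_depth = 15.0                     # km (illustrative)
v_obs = surface_velocity(x, true_rate, true_depth)

def invert_slip_rate(v_obs, x_km, assumed_depth_km):
    """Least-squares slip rate for a fixed assumed locking depth.
    v is linear in s for fixed D, so the fit is a one-parameter projection."""
    g = (1.0 / np.pi) * np.arctan(x_km / assumed_depth_km)
    return np.dot(g, v_obs) / np.dot(g, g)

rate_correct = invert_slip_rate(v_obs, x, assumed_depth_km=15.0)
rate_wrong = invert_slip_rate(v_obs, x, assumed_depth_km=5.0)  # too-shallow geometry

print(f"correct geometry: {rate_correct:.2f} mm/yr")  # recovers the true rate
print(f"wrong geometry:   {rate_wrong:.2f} mm/yr")    # systematically biased
```

Even in this one-parameter toy problem, the inversion with the wrong depth underestimates the slip rate; the full two-dimensional ramp-flat models in the study exhibit the same sensitivity to assumed down-dip geometry.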